Dec 01 15:00:52 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 15:00:52 crc restorecon[4680]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 15:00:52 crc restorecon[4680]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc 
restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc 
restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 
15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc 
restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc 
restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 15:00:52
crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 
15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc 
restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc 
restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:52 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc 
restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 15:00:53 crc restorecon[4680]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 
crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc 
restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc 
restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 15:00:53 crc restorecon[4680]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 01 15:00:54 crc kubenswrapper[4931]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 15:00:54 crc kubenswrapper[4931]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 01 15:00:54 crc kubenswrapper[4931]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 15:00:54 crc kubenswrapper[4931]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 01 15:00:54 crc kubenswrapper[4931]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 01 15:00:54 crc kubenswrapper[4931]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.020880 4931 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027566 4931 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027597 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027607 4931 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027616 4931 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027628 4931 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027638 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027647 4931 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027657 4931 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027667 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027675 4931 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027685 4931 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027695 4931 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027705 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027715 4931 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027724 4931 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027732 4931 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027740 4931 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027749 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027757 4931 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027767 4931 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027777 4931 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027785 4931 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027793 4931 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027804 4931 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027813 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027821 4931 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027829 4931 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027837 4931 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027846 4931 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027856 4931 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027864 4931 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027871 4931 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027879 4931 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027888 4931 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027896 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027904 4931 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027912 4931 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027919 4931 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027927 4931 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027935 4931 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027946 4931 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027953 4931 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027961 4931 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027972 4931 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027981 4931 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.027991 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028001 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028012 4931 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028022 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028033 4931 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028043 4931 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028052 4931 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028062 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028073 4931 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028082 4931 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028092 4931 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028100 4931 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028108 4931 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028115 4931 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028125 4931 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028133 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028140 4931 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028148 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028155 4931 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028162 4931 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028170 4931 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028178 4931 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028186 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028193 4931 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028200 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.028208 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028447 4931 flags.go:64] FLAG: --address="0.0.0.0"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028466 4931 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028482 4931 flags.go:64] FLAG: --anonymous-auth="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028493 4931 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028505 4931 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028535 4931 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028547 4931 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028559 4931 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028568 4931 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028578 4931 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028587 4931 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028599 4931 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028608 4931 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028618 4931 flags.go:64] FLAG: --cgroup-root=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028626 4931 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028635 4931 flags.go:64] FLAG: --client-ca-file=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028644 4931 flags.go:64] FLAG: --cloud-config=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028653 4931 flags.go:64] FLAG: --cloud-provider=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028661 4931 flags.go:64] FLAG: --cluster-dns="[]"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028673 4931 flags.go:64] FLAG: --cluster-domain=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028682 4931 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028692 4931 flags.go:64] FLAG: --config-dir=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028701 4931 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028710 4931 flags.go:64] FLAG: --container-log-max-files="5"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028723 4931 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028733 4931 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028742 4931 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028751 4931 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028760 4931 flags.go:64] FLAG: --contention-profiling="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028769 4931 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028778 4931 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028788 4931 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028797 4931 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028808 4931 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028817 4931 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028827 4931 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028836 4931 flags.go:64] FLAG: --enable-load-reader="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028845 4931 flags.go:64] FLAG: --enable-server="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028853 4931 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028865 4931 flags.go:64] FLAG: --event-burst="100"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028874 4931 flags.go:64] FLAG: --event-qps="50"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028884 4931 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028893 4931 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028902 4931 flags.go:64] FLAG: --eviction-hard=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028913 4931 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028922 4931 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028931 4931 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028941 4931 flags.go:64] FLAG: --eviction-soft=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028950 4931 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028959 4931 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028967 4931 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028977 4931 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028986 4931 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.028995 4931 flags.go:64] FLAG: --fail-swap-on="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029004 4931 flags.go:64] FLAG: --feature-gates=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029015 4931 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029025 4931 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029035 4931 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029044 4931 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029054 4931 flags.go:64] FLAG: --healthz-port="10248"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029064 4931 flags.go:64] FLAG: --help="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029072 4931 flags.go:64] FLAG: --hostname-override=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029081 4931 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029090 4931 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029100 4931 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029108 4931 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029117 4931 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029126 4931 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029135 4931 flags.go:64] FLAG: --image-service-endpoint=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029143 4931 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029152 4931 flags.go:64] FLAG: --kube-api-burst="100"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029161 4931 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029171 4931 flags.go:64] FLAG: --kube-api-qps="50"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029180 4931 flags.go:64] FLAG: --kube-reserved=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029188 4931 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029197 4931 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029207 4931 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029216 4931 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029225 4931 flags.go:64] FLAG: --lock-file=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029234 4931 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029243 4931 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029252 4931 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029268 4931 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029279 4931 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029290 4931 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029300 4931 flags.go:64] FLAG: --logging-format="text"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029310 4931 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029321 4931 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029332 4931 flags.go:64] FLAG: --manifest-url=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029342 4931 flags.go:64] FLAG: --manifest-url-header=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029356 4931 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029366 4931 flags.go:64] FLAG: --max-open-files="1000000"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029380 4931 flags.go:64] FLAG: --max-pods="110"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029426 4931 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029437 4931 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029448 4931 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029467 4931 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029479 4931 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029490 4931 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029501 4931 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029525 4931 flags.go:64] FLAG: --node-status-max-images="50"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029536 4931 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029549 4931 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029561 4931 flags.go:64] FLAG: --pod-cidr=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029572 4931 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029590 4931 flags.go:64] FLAG: --pod-manifest-path=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029683 4931 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029694 4931 flags.go:64] FLAG: --pods-per-core="0"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029704 4931 flags.go:64] FLAG: --port="10250"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029714 4931 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029723 4931 flags.go:64] FLAG: --provider-id=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029732 4931 flags.go:64] FLAG: --qos-reserved=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029741 4931 flags.go:64] FLAG: --read-only-port="10255"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029750 4931 flags.go:64] FLAG: --register-node="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029759 4931 flags.go:64] FLAG: --register-schedulable="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029769 4931 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029784 4931 flags.go:64] FLAG: --registry-burst="10"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029793 4931 flags.go:64] FLAG: --registry-qps="5"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029803 4931 flags.go:64] FLAG: --reserved-cpus=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029813 4931 flags.go:64] FLAG: --reserved-memory=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029862 4931 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029871 4931 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029881 4931 flags.go:64] FLAG: --rotate-certificates="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029890 4931 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029899 4931 flags.go:64] FLAG: --runonce="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029908 4931 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029917 4931 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029927 4931 flags.go:64] FLAG: --seccomp-default="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029936 4931 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029945 4931 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029955 4931 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029978 4931 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029990 4931 flags.go:64] FLAG: --storage-driver-password="root"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.029999 4931 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030009 4931 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030017 4931 flags.go:64] FLAG: --storage-driver-user="root"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030026 4931 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030035 4931 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030045 4931 flags.go:64] FLAG: --system-cgroups=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030054 4931 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030068 4931 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030077 4931 flags.go:64] FLAG: --tls-cert-file=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030086 4931 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030098 4931 flags.go:64] FLAG: --tls-min-version=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030108 4931 flags.go:64] FLAG: --tls-private-key-file=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030117 4931 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030126 4931 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030135 4931 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030144 4931 flags.go:64] FLAG: --v="2"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030162 4931 flags.go:64] FLAG: --version="false"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030174 4931 flags.go:64] FLAG: --vmodule=""
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030185 4931 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.030194 4931 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030489 4931 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030501 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030510 4931 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030518 4931 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030526 4931 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030533 4931 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030556 4931 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030566 4931 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030575 4931 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030583 4931 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030595 4931 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030605 4931 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030614 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030623 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030632 4931 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030641 4931 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030649 4931 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030658 4931 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030671 4931 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030682 4931 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030692 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030702 4931 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030712 4931 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030721 4931 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030732 4931 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030741 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030749 4931 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030757 4931 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030766 4931 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030773 4931 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030782 4931 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030791 4931 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030800 4931 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030808 4931 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030816 4931 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030827 4931 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030836 4931 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030845 4931 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030856 4931 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030864 4931 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030873 4931 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030881 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030889 4931 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030896 4931 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030904 4931 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030912 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030920 4931 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030927 4931 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030935 4931 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030943 4931 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030950 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030959 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030967 4931 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030975 4931 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030982 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030991 4931 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.030998 4931 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031006 4931 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031014 4931 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031021 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031029 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031037 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031045 4931 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031052 4931 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031059 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031067 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031075 4931 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031082 4931 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031090 4931 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031098 4931 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.031109 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.031121 4931 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.042165 4931 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.042197 4931 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042318 4931 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042333 4931 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042345 4931 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042354 4931 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042363 4931 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042373 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042382 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042418 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042427 4931 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042434 4931 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042443 4931 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042451 4931 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042459 4931 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042466 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042474 4931 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042482 4931 feature_gate.go:330] unrecognized
feature gate: BootcNodeManagement Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042489 4931 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042497 4931 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042505 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042512 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042522 4931 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042532 4931 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042541 4931 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042550 4931 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042559 4931 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042567 4931 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042575 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042583 4931 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042591 4931 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042599 4931 feature_gate.go:330] unrecognized feature gate: 
AdminNetworkPolicy Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042606 4931 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042617 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042624 4931 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042632 4931 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042640 4931 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042648 4931 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042656 4931 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042664 4931 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042671 4931 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042679 4931 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042686 4931 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042694 4931 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042717 4931 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042724 4931 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 
15:00:54.042732 4931 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042739 4931 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042747 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042755 4931 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042763 4931 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042770 4931 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042777 4931 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042785 4931 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042793 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042800 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042807 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042815 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042822 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042830 4931 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042837 4931 feature_gate.go:330] unrecognized feature gate: 
NetworkLiveMigration Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042845 4931 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042852 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042860 4931 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042868 4931 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042876 4931 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042884 4931 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042891 4931 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042901 4931 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042910 4931 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042918 4931 feature_gate.go:330] unrecognized feature gate: Example Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042926 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.042934 4931 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.042946 4931 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043211 4931 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043224 4931 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043234 4931 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043242 4931 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043250 4931 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043258 4931 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043277 4931 feature_gate.go:330] unrecognized feature gate: 
PersistentIPsForVirtualization Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043288 4931 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043297 4931 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043305 4931 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043312 4931 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043320 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043328 4931 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043336 4931 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043344 4931 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043351 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043359 4931 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043367 4931 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043374 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043382 4931 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043417 4931 feature_gate.go:330] 
unrecognized feature gate: RouteAdvertisements Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043425 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043435 4931 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043443 4931 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043450 4931 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043458 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043466 4931 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043473 4931 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043482 4931 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043490 4931 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043499 4931 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043508 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043516 4931 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043525 4931 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043533 4931 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043540 4931 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043549 4931 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043557 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043565 4931 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043573 4931 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043581 4931 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043588 4931 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043609 4931 feature_gate.go:330] unrecognized feature gate: Example Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043617 4931 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043624 4931 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043632 4931 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043639 4931 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043647 4931 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043654 4931 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043662 4931 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043670 4931 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043677 4931 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043685 4931 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043693 4931 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043700 4931 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043710 4931 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043720 4931 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043730 4931 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043739 4931 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043749 4931 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043758 4931 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043767 4931 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043775 4931 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043785 4931 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043795 4931 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043804 4931 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043813 4931 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043822 4931 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043830 4931 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043839 4931 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.043847 4931 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.043858 4931 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.044343 4931 server.go:940] "Client rotation is on, will bootstrap in background" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.049207 4931 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.049323 4931 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.050218 4931 server.go:997] "Starting client certificate rotation" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.050257 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.050717 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-13 07:03:20.39832334 +0000 UTC Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.050812 4931 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 280h2m26.347530332s for next certificate rotation Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.062708 4931 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.066447 4931 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.086594 4931 log.go:25] "Validated CRI v1 runtime API" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.121042 4931 log.go:25] "Validated CRI v1 image API" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 
15:00:54.123128 4931 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.127133 4931 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-14-56-42-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.127181 4931 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.154701 4931 manager.go:217] Machine: {Timestamp:2025-12-01 15:00:54.152154246 +0000 UTC m=+0.578027973 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a263e267-40f6-4472-9fe3-92cd328d0ad9 BootID:2309286a-3bdf-4d90-8920-f6c1244ed71c Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f2:f3:66 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f2:f3:66 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:40:12:ed Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:d1:e2:04 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a2:76:6e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:58:09:74 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:36:a9:4e:95:3c:da Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:69:d2:70:41:dd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] 
CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.155177 4931 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.155434 4931 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.157382 4931 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.157807 4931 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.157864 4931 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.158216 4931 topology_manager.go:138] "Creating topology manager with none policy" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.158237 4931 container_manager_linux.go:303] "Creating device plugin manager" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.158602 4931 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.158666 4931 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.158925 4931 state_mem.go:36] "Initialized new in-memory state store" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.159548 4931 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.160551 4931 kubelet.go:418] "Attempting to sync node with API server" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.160590 4931 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.160632 4931 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.160654 4931 kubelet.go:324] "Adding apiserver pod source" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.160675 4931 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.164115 4931 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.164596 4931 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.165388 4931 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166239 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166282 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166297 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166311 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166333 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166347 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166359 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166382 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166438 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166453 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166518 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.166532 4931 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.167091 4931 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.167787 4931 server.go:1280] "Started kubelet" Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.168827 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.170677 4931 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 01 15:00:54 crc systemd[1]: Started Kubernetes Kubelet. Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.172074 4931 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.172264 4931 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.172254 4931 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.172337 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.172432 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: 
connection refused" logger="UnhandledError" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.170654 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.173565 4931 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.2:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d1f7dcce0a21e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 15:00:54.16774915 +0000 UTC m=+0.593622847,LastTimestamp:2025-12-01 15:00:54.16774915 +0000 UTC m=+0.593622847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.175567 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.176914 4931 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.177659 4931 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.177679 4931 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.177667 4931 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.177713 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 18:59:53.46933091 +0000 UTC Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.177768 4931 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 987h58m59.291569041s for next certificate rotation Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.177821 4931 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.178251 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="200ms" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.178514 4931 server.go:460] "Adding debug handlers to kubelet server" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.180014 4931 factory.go:55] Registering systemd factory Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.180040 4931 factory.go:221] Registration of the systemd container factory successfully Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.180363 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.180490 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: 
connection refused" logger="UnhandledError" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.180584 4931 factory.go:153] Registering CRI-O factory Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.180627 4931 factory.go:221] Registration of the crio container factory successfully Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.180736 4931 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.180780 4931 factory.go:103] Registering Raw factory Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.180808 4931 manager.go:1196] Started watching for new ooms in manager Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.182240 4931 manager.go:319] Starting recovery of all containers Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.194092 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.194451 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.194659 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.194815 4931 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.194948 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.195074 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.195236 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.195437 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.195591 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.195726 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.195904 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.196072 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.196196 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.196332 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.196492 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.196967 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.197187 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.197330 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.197502 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.197635 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.197779 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.197901 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" 
seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.198039 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.198169 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.198339 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.198540 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.198681 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.198835 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 
15:00:54.198965 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.199095 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.199213 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.199339 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.199666 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.199819 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.199951 4931 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.200072 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.200198 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.200954 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.201120 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.201270 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.201424 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.201552 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.201681 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.201954 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202152 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202292 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202448 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202627 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202695 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202774 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202792 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202816 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202845 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202871 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202900 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202917 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202947 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202966 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.202985 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203000 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203016 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203030 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203044 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203062 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203076 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203093 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203106 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203119 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203134 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203147 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203167 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203179 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203193 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203212 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203229 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203253 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203271 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 01 
15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203290 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203313 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203329 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203345 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203362 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203376 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203414 4931 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203428 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203453 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203471 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203484 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203504 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203518 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203532 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203550 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203564 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203580 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.203596 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204201 4931 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204238 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204253 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204267 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204282 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204327 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204343 4931 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204357 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204373 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204469 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204500 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204515 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204533 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204550 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204565 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204580 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204592 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204609 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204625 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204640 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204655 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204669 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204685 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204697 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204708 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204724 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204735 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204750 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204762 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204774 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204788 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" 
seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204799 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204810 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204824 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204837 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204852 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204865 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: 
I1201 15:00:54.204876 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204891 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204903 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204916 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204929 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204943 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204959 4931 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204969 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204983 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.204995 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205007 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205022 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205036 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205093 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205111 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205125 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205136 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205148 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205161 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205172 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205187 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205199 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205210 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205224 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205235 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205248 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205259 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205272 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205285 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205319 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205336 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205346 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205359 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205373 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205388 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205415 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205428 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" 
seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205440 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205453 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205495 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205512 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205534 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205550 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205568 4931 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205587 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205599 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205614 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205626 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205639 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205652 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205663 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205692 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205724 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205736 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205750 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205762 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.205777 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206738 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206752 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206764 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206775 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206787 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206799 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206810 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206822 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206833 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206844 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206855 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206873 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206884 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206894 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206905 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206916 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206928 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206942 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206955 4931 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206965 4931 reconstruct.go:97] "Volume reconstruction finished" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.206973 4931 reconciler.go:26] "Reconciler: start to sync state" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.215589 4931 manager.go:324] Recovery completed Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.232730 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.235523 4931 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.235608 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.235641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.235657 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.236835 4931 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.236880 4931 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.236915 4931 state_mem.go:36] "Initialized new in-memory state store" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.240129 4931 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.240176 4931 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.240196 4931 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.240234 4931 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.241294 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.241414 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.252029 4931 policy_none.go:49] "None policy: Start" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.253748 4931 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.253790 4931 state_mem.go:35] "Initializing new in-memory state store" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.277830 4931 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.318711 4931 manager.go:334] "Starting Device Plugin manager" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.318797 4931 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.318812 4931 server.go:79] "Starting device plugin registration server" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.319304 4931 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.319324 4931 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.319666 4931 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.319771 4931 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.319780 4931 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.328891 4931 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.341042 4931 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.341161 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.342271 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.342315 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.342329 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.342630 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.342918 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.342989 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.343499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.343575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.343591 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.343744 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.343831 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.343852 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.343919 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.343938 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.343946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.344658 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.344691 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.344700 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.344729 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.344779 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.344797 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.345046 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc 
kubenswrapper[4931]: I1201 15:00:54.345144 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.345179 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.346680 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.346710 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.346723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.346992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.347028 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.347038 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.347221 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.347322 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.347351 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.348118 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.348131 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.348139 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.348147 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.348159 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.348150 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.348426 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.348462 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.349217 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.349236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.349246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.379092 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="400ms" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410070 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410170 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410222 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410263 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410336 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410489 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410627 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410700 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410783 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410835 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.410921 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.411027 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.411079 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.411114 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.411166 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.420767 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.422853 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.422942 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.422966 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.423011 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.423820 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.2:6443: connect: connection refused" node="crc" Dec 01 15:00:54 
crc kubenswrapper[4931]: I1201 15:00:54.513080 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513155 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513180 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513208 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513231 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513254 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513279 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513304 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513329 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513315 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513353 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") 
" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513375 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513425 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513445 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513374 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513509 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513456 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513544 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513546 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513510 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513568 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513467 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513584 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513648 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513622 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513763 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513818 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513840 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513889 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.513995 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.624845 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.626222 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.626273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.626288 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.626315 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.626780 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.2:6443: connect: connection refused" 
node="crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.675584 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.694242 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.708479 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4537ebf4cd413a5f73e99ef36f13cb53412101999952937003519faa4957507b WatchSource:0}: Error finding container 4537ebf4cd413a5f73e99ef36f13cb53412101999952937003519faa4957507b: Status 404 returned error can't find the container with id 4537ebf4cd413a5f73e99ef36f13cb53412101999952937003519faa4957507b Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.715144 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.721108 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-35356ca891282a0002e14a6607768ee4e434597e70fd940196fa9fa26f87094d WatchSource:0}: Error finding container 35356ca891282a0002e14a6607768ee4e434597e70fd940196fa9fa26f87094d: Status 404 returned error can't find the container with id 35356ca891282a0002e14a6607768ee4e434597e70fd940196fa9fa26f87094d Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.726471 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.728013 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6e80dd74c8758270d762545c5800d85adcdc40de264ab1d9e61391dfce0a8a59 WatchSource:0}: Error finding container 6e80dd74c8758270d762545c5800d85adcdc40de264ab1d9e61391dfce0a8a59: Status 404 returned error can't find the container with id 6e80dd74c8758270d762545c5800d85adcdc40de264ab1d9e61391dfce0a8a59 Dec 01 15:00:54 crc kubenswrapper[4931]: I1201 15:00:54.731782 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.738413 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0de06198a85f526ee20ea8370d7618f8e2f9dc6992abd6b40bf24188351caa20 WatchSource:0}: Error finding container 0de06198a85f526ee20ea8370d7618f8e2f9dc6992abd6b40bf24188351caa20: Status 404 returned error can't find the container with id 0de06198a85f526ee20ea8370d7618f8e2f9dc6992abd6b40bf24188351caa20 Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.752309 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-97e4e5a8cbbc23ae526907fa2af0522b7aeca7c0e751dba5f26608ae14b13292 WatchSource:0}: Error finding container 97e4e5a8cbbc23ae526907fa2af0522b7aeca7c0e751dba5f26608ae14b13292: Status 404 returned error can't find the container with id 97e4e5a8cbbc23ae526907fa2af0522b7aeca7c0e751dba5f26608ae14b13292 Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.780647 4931 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="800ms" Dec 01 15:00:54 crc kubenswrapper[4931]: W1201 15:00:54.991214 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:54 crc kubenswrapper[4931]: E1201 15:00:54.991343 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 01 15:00:55 crc kubenswrapper[4931]: W1201 15:00:55.008380 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:55 crc kubenswrapper[4931]: E1201 15:00:55.008487 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.027751 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.029503 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 
15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.029549 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.029567 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.029600 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 15:00:55 crc kubenswrapper[4931]: E1201 15:00:55.030010 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.2:6443: connect: connection refused" node="crc" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.173651 4931 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:55 crc kubenswrapper[4931]: W1201 15:00:55.241930 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:55 crc kubenswrapper[4931]: E1201 15:00:55.242044 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.247416 4931 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb" exitCode=0 Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.247483 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.247660 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4537ebf4cd413a5f73e99ef36f13cb53412101999952937003519faa4957507b"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.247829 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.249347 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.249412 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.249450 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"97e4e5a8cbbc23ae526907fa2af0522b7aeca7c0e751dba5f26608ae14b13292"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.249429 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.249487 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.250664 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf" exitCode=0 Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.250684 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.250723 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0de06198a85f526ee20ea8370d7618f8e2f9dc6992abd6b40bf24188351caa20"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.250832 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.252466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.252504 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.252519 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.252891 4931 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf" exitCode=0 Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.252952 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.252977 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6e80dd74c8758270d762545c5800d85adcdc40de264ab1d9e61391dfce0a8a59"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.253092 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.255044 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.255316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.255946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.255969 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.257532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.257577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.257593 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.257593 4931 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"266644e388b0a2b662912f43197e958f6f3d51c4a14d9dc615bc2ab644a35cd7"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.257705 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.257310 4931 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="266644e388b0a2b662912f43197e958f6f3d51c4a14d9dc615bc2ab644a35cd7" exitCode=0 Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.258241 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"35356ca891282a0002e14a6607768ee4e434597e70fd940196fa9fa26f87094d"} Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.258772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.258815 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.258834 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:55 crc kubenswrapper[4931]: E1201 15:00:55.581596 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="1.6s" Dec 01 15:00:55 crc kubenswrapper[4931]: W1201 15:00:55.666926 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:55 crc kubenswrapper[4931]: E1201 15:00:55.667015 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.830485 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.832075 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.832110 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.832121 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:55 crc kubenswrapper[4931]: I1201 15:00:55.832146 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 15:00:55 crc kubenswrapper[4931]: E1201 15:00:55.832635 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.2:6443: connect: connection refused" node="crc" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.173274 4931 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:56 crc 
kubenswrapper[4931]: I1201 15:00:56.266532 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.266585 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.266602 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.266612 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.270033 4931 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1" exitCode=0 Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.270095 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.270248 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 
15:00:56.272340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.272362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.272372 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.273728 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"825f8f254baad923387374c5af4df73ee3517dbe50ec03d4ab824d260692d4b2"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.273908 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.275023 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.275068 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.275083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.277190 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.277246 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.277256 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.277365 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.278437 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.278496 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.278513 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.281997 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.282058 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.282079 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043"} Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.282117 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.283642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.283700 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.283714 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:56 crc kubenswrapper[4931]: I1201 15:00:56.497708 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:00:56 crc kubenswrapper[4931]: W1201 15:00:56.701424 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:56 crc kubenswrapper[4931]: E1201 15:00:56.701567 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 01 15:00:56 crc kubenswrapper[4931]: W1201 15:00:56.893810 4931 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 01 15:00:56 crc kubenswrapper[4931]: E1201 15:00:56.893950 4931 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.289010 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7"} Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.289232 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.290623 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.290654 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.290665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.292268 4931 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9" exitCode=0 Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.292319 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9"} Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.292354 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.292473 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.292594 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.293540 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.293576 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.293590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.293789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.293904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.293919 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.294510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.294528 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.294538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.432812 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.434819 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.434887 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.434906 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:57 crc kubenswrapper[4931]: I1201 15:00:57.434950 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.299279 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310"} Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.299765 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0"} Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.299791 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.299806 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b"} Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.299505 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.299311 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.301113 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.301124 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.301137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.301148 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.301153 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.301168 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:58 crc kubenswrapper[4931]: I1201 15:00:58.896175 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.308413 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754"} Dec 01 
15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.308478 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55"} Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.308496 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.308639 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.309631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.309666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.309677 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.310339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.310443 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.310470 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.498499 4931 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" start-of-body= Dec 01 15:00:59 crc kubenswrapper[4931]: I1201 15:00:59.498648 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 15:01:00 crc kubenswrapper[4931]: I1201 15:01:00.311141 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:00 crc kubenswrapper[4931]: I1201 15:01:00.311249 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:00 crc kubenswrapper[4931]: I1201 15:01:00.312585 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:00 crc kubenswrapper[4931]: I1201 15:01:00.312625 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:00 crc kubenswrapper[4931]: I1201 15:01:00.312643 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:00 crc kubenswrapper[4931]: I1201 15:01:00.312782 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:00 crc kubenswrapper[4931]: I1201 15:01:00.312825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:00 crc kubenswrapper[4931]: I1201 15:01:00.312842 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.326582 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.326839 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.328339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.328420 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.328435 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.334706 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.389343 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.389661 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.391205 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.391263 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.391282 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.397112 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.496254 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.496588 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.498131 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.498178 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:01 crc kubenswrapper[4931]: I1201 15:01:01.498189 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:02 crc kubenswrapper[4931]: I1201 15:01:02.316380 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:02 crc kubenswrapper[4931]: I1201 15:01:02.316455 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:01:02 crc kubenswrapper[4931]: I1201 15:01:02.317807 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:02 crc kubenswrapper[4931]: I1201 15:01:02.317850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:02 crc kubenswrapper[4931]: I1201 15:01:02.317869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:02 crc kubenswrapper[4931]: I1201 15:01:02.738542 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 01 15:01:02 
crc kubenswrapper[4931]: I1201 15:01:02.738800 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:02 crc kubenswrapper[4931]: I1201 15:01:02.740300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:02 crc kubenswrapper[4931]: I1201 15:01:02.740352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:02 crc kubenswrapper[4931]: I1201 15:01:02.740369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:03 crc kubenswrapper[4931]: I1201 15:01:03.319109 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:03 crc kubenswrapper[4931]: I1201 15:01:03.320280 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:03 crc kubenswrapper[4931]: I1201 15:01:03.320360 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:03 crc kubenswrapper[4931]: I1201 15:01:03.320423 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:04 crc kubenswrapper[4931]: E1201 15:01:04.329011 4931 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.015625 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.015917 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.017832 4931 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.017907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.017927 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.561002 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.562092 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.563329 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.563438 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:05 crc kubenswrapper[4931]: I1201 15:01:05.563505 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:07 crc kubenswrapper[4931]: I1201 15:01:07.102182 4931 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 15:01:07 crc kubenswrapper[4931]: I1201 15:01:07.102257 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 15:01:07 crc kubenswrapper[4931]: I1201 15:01:07.107651 4931 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 15:01:07 crc kubenswrapper[4931]: I1201 15:01:07.107730 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 15:01:09 crc kubenswrapper[4931]: I1201 15:01:09.498793 4931 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 15:01:09 crc kubenswrapper[4931]: I1201 15:01:09.498908 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 15:01:11 crc kubenswrapper[4931]: I1201 15:01:11.504127 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:01:11 crc kubenswrapper[4931]: I1201 15:01:11.504381 4931 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:11 crc kubenswrapper[4931]: I1201 15:01:11.506991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:11 crc kubenswrapper[4931]: I1201 15:01:11.507062 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:11 crc kubenswrapper[4931]: I1201 15:01:11.507083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:11 crc kubenswrapper[4931]: I1201 15:01:11.511852 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.082590 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.084829 4931 trace.go:236] Trace[1388669314]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 15:00:57.833) (total time: 14251ms): Dec 01 15:01:12 crc kubenswrapper[4931]: Trace[1388669314]: ---"Objects listed" error: 14251ms (15:01:12.084) Dec 01 15:01:12 crc kubenswrapper[4931]: Trace[1388669314]: [14.251147248s] [14.251147248s] END Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.084851 4931 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.096077 4931 trace.go:236] Trace[800848032]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 15:01:01.271) (total time: 10824ms): Dec 01 15:01:12 crc kubenswrapper[4931]: Trace[800848032]: ---"Objects listed" error: 10824ms 
(15:01:12.095) Dec 01 15:01:12 crc kubenswrapper[4931]: Trace[800848032]: [10.824296187s] [10.824296187s] END Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.096115 4931 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.096650 4931 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.098232 4931 trace.go:236] Trace[1766843176]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 15:00:58.112) (total time: 13986ms): Dec 01 15:01:12 crc kubenswrapper[4931]: Trace[1766843176]: ---"Objects listed" error: 13985ms (15:01:12.097) Dec 01 15:01:12 crc kubenswrapper[4931]: Trace[1766843176]: [13.986010986s] [13.986010986s] END Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.098275 4931 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.102767 4931 trace.go:236] Trace[1009037645]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 15:01:01.127) (total time: 10975ms): Dec 01 15:01:12 crc kubenswrapper[4931]: Trace[1009037645]: ---"Objects listed" error: 10975ms (15:01:12.102) Dec 01 15:01:12 crc kubenswrapper[4931]: Trace[1009037645]: [10.975323297s] [10.975323297s] END Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.102820 4931 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.165875 4931 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.172187 4931 apiserver.go:52] "Watching apiserver" Dec 01 15:01:12 crc 
kubenswrapper[4931]: I1201 15:01:12.175228 4931 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.175565 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.176151 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.176249 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.176347 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.176441 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.176458 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.176609 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.176625 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.176833 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.176884 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.178689 4931 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.179330 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.179606 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.180039 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.180080 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.180040 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.180080 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.180160 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.180202 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.182883 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 
15:01:12.210580 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.222977 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.238613 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.264182 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266568 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266640 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266674 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266711 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266732 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266757 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266780 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") 
" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266805 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266828 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266852 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266879 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266907 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266932 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266960 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.266987 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267012 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267035 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267061 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 15:01:12 crc 
kubenswrapper[4931]: I1201 15:01:12.267088 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267114 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267140 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267128 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267169 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267198 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267222 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267247 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267272 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267295 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267275 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267318 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267345 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267352 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267371 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267417 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267452 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267483 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267505 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267528 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267550 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267584 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267611 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267634 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267664 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 15:01:12 
crc kubenswrapper[4931]: I1201 15:01:12.267686 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267689 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267713 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267751 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267779 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267805 4931 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267828 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267850 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267853 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267874 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267907 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267923 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267911 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267945 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267992 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268018 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268042 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268094 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268121 4931 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268147 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268172 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268196 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268225 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268249 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268282 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268307 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268332 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268360 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268410 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc 
kubenswrapper[4931]: I1201 15:01:12.268439 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268465 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268489 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268514 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268546 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268578 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268603 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268628 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268658 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268694 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268733 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268758 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268782 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268813 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268844 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268871 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268893 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" 
(UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268921 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268949 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268973 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268997 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269020 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269045 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269068 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269093 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269116 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269141 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269165 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 15:01:12 crc kubenswrapper[4931]: 
I1201 15:01:12.269191 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269217 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269242 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269266 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269289 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269314 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269337 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269360 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269402 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269430 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269455 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269479 
4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269505 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269530 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269555 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269581 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269604 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269630 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269655 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269679 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269702 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269731 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 15:01:12 crc 
kubenswrapper[4931]: I1201 15:01:12.269755 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269779 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269802 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269826 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269849 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269874 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269896 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269919 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269944 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269971 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269997 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " 
Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270022 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270047 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270070 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270093 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270120 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270145 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270172 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270197 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270225 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270249 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270273 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 15:01:12 
crc kubenswrapper[4931]: I1201 15:01:12.270297 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270322 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270374 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270417 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270445 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270474 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270506 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270531 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270629 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270656 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270684 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270710 
4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270736 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270762 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270791 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270819 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270842 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270866 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270891 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270917 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270939 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270966 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.267943 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270994 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268121 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271020 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271049 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271073 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271092 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271112 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271131 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271151 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271170 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271195 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271221 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271242 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 15:01:12 crc 
kubenswrapper[4931]: I1201 15:01:12.271260 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271277 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271295 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271313 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271434 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271461 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271480 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271502 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271522 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271541 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271562 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 15:01:12 crc 
kubenswrapper[4931]: I1201 15:01:12.271585 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271616 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271640 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271658 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271676 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271697 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271715 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271735 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271753 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271773 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271791 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271809 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271828 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271878 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271913 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271950 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271971 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271995 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272014 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272035 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272059 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272079 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272099 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272121 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272140 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 
15:01:12.272162 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272182 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272241 4931 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272253 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272265 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272278 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272289 4931 reconciler_common.go:293] 
"Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272298 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272310 4931 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272321 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272333 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.273421 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277750 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.279247 4931 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.279727 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268193 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268322 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268323 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268650 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268659 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268741 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268758 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268956 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.268989 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269006 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269061 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269123 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269149 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269189 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269296 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269368 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269410 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269445 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269493 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269509 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.269682 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270004 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270015 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270111 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270281 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270400 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270523 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270526 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270754 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270964 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270983 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271176 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271309 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271279 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271402 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271425 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271544 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.270912 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271673 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271956 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.271977 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272106 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272487 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272550 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272582 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272666 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.272938 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.273197 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.273344 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.273218 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.273585 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.273632 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.273664 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.274074 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.274175 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.274217 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.274265 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.274306 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.274884 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275059 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275097 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275120 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275231 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275250 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275462 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275514 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275533 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275656 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275762 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275908 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.275793 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.276142 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.276177 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.276257 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.276368 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.276756 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.276777 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277005 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277036 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277040 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277004 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277197 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277336 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277363 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277527 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.277648 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.284947 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.284968 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:12.784944096 +0000 UTC m=+19.210817763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277939 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.278305 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.278311 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.278576 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.278605 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.278737 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.279670 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.279949 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.285334 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.285573 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.285646 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.285755 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.285771 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.285821 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.285976 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.286012 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.280275 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.280356 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.280610 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.280836 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.286264 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:12.786242904 +0000 UTC m=+19.212116721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.282107 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.283943 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.286292 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.284110 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.284162 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.284221 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.284315 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.284332 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.277667 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.286814 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.287038 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.287336 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.287956 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.288441 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.288672 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.288922 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.289082 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.289206 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:01:12.789175122 +0000 UTC m=+19.215048799 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.289466 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.290013 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.290129 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.290504 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.290862 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.291251 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.291508 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.291562 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.291569 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.291909 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.291935 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.291976 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.292245 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.292604 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.293240 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.293444 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.293724 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.293743 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.293758 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.293816 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:12.7938003 +0000 UTC m=+19.219674177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.293840 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.293905 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.293919 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.293984 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.294260 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.294449 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.294558 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.294595 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.294624 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.294632 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.294643 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.294678 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.294977 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.295004 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.296206 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.296255 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:12.796233473 +0000 UTC m=+19.222107140 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.296441 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.296468 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.296652 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.296698 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.296864 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.297005 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.297218 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.297296 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.297326 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.297470 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.297541 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.297701 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.297924 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.298008 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.298220 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.299089 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.299883 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.300295 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.300335 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.300371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.300646 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.303775 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.304533 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.305660 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.305979 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.307201 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.308319 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.308654 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") 
pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.310080 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.311302 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.313618 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.313516 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.313964 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.313986 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.314053 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.314580 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.314968 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.315727 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.321725 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.322534 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.336814 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.342133 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.346883 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.346904 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.349116 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.366784 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373425 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373483 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373549 4931 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373667 4931 reconciler_common.go:293] 
"Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373685 4931 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373696 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373709 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373719 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373730 4931 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373739 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373747 4931 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373756 4931 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373765 4931 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373774 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373783 4931 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373791 4931 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373801 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373809 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373819 4931 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373829 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373839 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373848 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373859 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373872 4931 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373884 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373896 4931 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373906 4931 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373942 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373953 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373963 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373973 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.373982 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 
15:01:12.373991 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374001 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374012 4931 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374022 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374032 4931 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374042 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374051 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374060 4931 
reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374069 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374078 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374087 4931 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374097 4931 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374106 4931 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374118 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374127 4931 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374137 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374147 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374156 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374167 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374176 4931 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374187 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374198 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374657 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374703 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374803 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374820 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374833 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374847 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath 
\"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374859 4931 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374870 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374881 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374892 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374902 4931 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374913 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374927 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374939 4931 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374951 4931 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374962 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374973 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374984 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.374994 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375005 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375016 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375027 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375039 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375050 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375063 4931 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375074 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375083 4931 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375095 4931 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 
crc kubenswrapper[4931]: I1201 15:01:12.375105 4931 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375116 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375125 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375136 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.375192 4931 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376354 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376409 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376485 4931 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376498 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376512 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376524 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376543 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376554 4931 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376565 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376576 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376594 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376605 4931 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376617 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376628 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376646 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376657 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376668 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376685 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376697 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376707 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376717 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376733 4931 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376743 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376754 4931 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 
15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376809 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376828 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376841 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376854 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376870 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376881 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376891 4931 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376904 
4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376919 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376929 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376939 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376950 4931 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376965 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376974 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376985 4931 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.376996 4931 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377014 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377024 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377034 4931 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377049 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377060 4931 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377071 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377081 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377097 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377107 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377117 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377127 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377143 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377154 4931 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" 
DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377164 4931 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377174 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377189 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377201 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377214 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377230 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377243 4931 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377253 4931 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377266 4931 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377283 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377293 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377305 4931 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377316 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377331 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377342 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node 
\"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377353 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377370 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377394 4931 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377405 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377416 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377434 4931 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377449 4931 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 
15:01:12.377460 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377475 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377493 4931 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377507 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377521 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377534 4931 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377553 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377566 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377580 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377600 4931 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377613 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377627 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377642 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377662 4931 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377673 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on 
node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377685 4931 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377701 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377718 4931 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377729 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377739 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377754 4931 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377768 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" 
DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377778 4931 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377788 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377803 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377815 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.377827 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.487369 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.492624 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.495029 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.501526 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 15:01:12 crc kubenswrapper[4931]: W1201 15:01:12.509773 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-7e843ae3cef8d5bf2439a4f3ced3ca2bdbd27ed6ce23d3482ff4345b4e2ac33e WatchSource:0}: Error finding container 7e843ae3cef8d5bf2439a4f3ced3ca2bdbd27ed6ce23d3482ff4345b4e2ac33e: Status 404 returned error can't find the container with id 7e843ae3cef8d5bf2439a4f3ced3ca2bdbd27ed6ce23d3482ff4345b4e2ac33e Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.512993 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: W1201 15:01:12.525761 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c4f7cc73e9b9ba7bb402122a0222364ec9f371c89d36875b06b2bb35dfede5c2 WatchSource:0}: Error finding container c4f7cc73e9b9ba7bb402122a0222364ec9f371c89d36875b06b2bb35dfede5c2: Status 404 returned error can't find the container with id 
c4f7cc73e9b9ba7bb402122a0222364ec9f371c89d36875b06b2bb35dfede5c2 Dec 01 15:01:12 crc kubenswrapper[4931]: W1201 15:01:12.530312 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-37754ffe476b7591b1d6aff33897644cb9de4e48d3f9e7afb34485e676a0fae3 WatchSource:0}: Error finding container 37754ffe476b7591b1d6aff33897644cb9de4e48d3f9e7afb34485e676a0fae3: Status 404 returned error can't find the container with id 37754ffe476b7591b1d6aff33897644cb9de4e48d3f9e7afb34485e676a0fae3 Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.543760 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.566443 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.587372 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.606491 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.627479 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.639047 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.882137 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.882215 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.882240 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.882266 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:12 crc kubenswrapper[4931]: I1201 15:01:12.882284 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882442 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882462 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882475 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882521 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:13.88250736 +0000 UTC m=+20.308381027 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882573 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:01:13.882567282 +0000 UTC m=+20.308440949 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882624 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882645 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:13.882638064 +0000 UTC m=+20.308511731 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882688 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882707 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 15:01:13.882701546 +0000 UTC m=+20.308575213 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882744 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882753 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882761 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:12 crc kubenswrapper[4931]: E1201 15:01:12.882777 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:13.882772238 +0000 UTC m=+20.308645905 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.240856 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.241057 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.353588 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5"} Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.353663 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366"} Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.353716 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"37754ffe476b7591b1d6aff33897644cb9de4e48d3f9e7afb34485e676a0fae3"} Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.355413 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c4f7cc73e9b9ba7bb402122a0222364ec9f371c89d36875b06b2bb35dfede5c2"} Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.357342 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30"} Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.357400 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7e843ae3cef8d5bf2439a4f3ced3ca2bdbd27ed6ce23d3482ff4345b4e2ac33e"} Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.378408 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.394643 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192
.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.414415 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.429958 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.442455 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.453756 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.464702 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.476596 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.487124 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.519269 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.559029 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.577926 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.594474 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.608658 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.792895 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-k8x6d"] Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.793267 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.794946 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.795403 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.795681 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.796967 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.817238 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.836544 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.864776 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.886855 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.890956 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 
15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.891046 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.891078 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891125 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:01:15.8910991 +0000 UTC m=+22.316972767 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.891176 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgq6\" (UniqueName: \"kubernetes.io/projected/62446422-f8d8-45d1-81ef-4228b06c21eb-kube-api-access-hhgq6\") pod \"node-ca-k8x6d\" (UID: \"62446422-f8d8-45d1-81ef-4228b06c21eb\") " pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891214 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.891226 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62446422-f8d8-45d1-81ef-4228b06c21eb-host\") pod \"node-ca-k8x6d\" (UID: \"62446422-f8d8-45d1-81ef-4228b06c21eb\") " pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.891267 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891267 4931 configmap.go:193] Couldn't get 
configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891292 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:15.891271925 +0000 UTC m=+22.317145672 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891612 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.891616 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62446422-f8d8-45d1-81ef-4228b06c21eb-serviceca\") pod \"node-ca-k8x6d\" (UID: \"62446422-f8d8-45d1-81ef-4228b06c21eb\") " pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891629 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891644 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.891658 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891712 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:15.891661476 +0000 UTC m=+22.317535133 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891739 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:15.891729348 +0000 UTC m=+22.317603015 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891763 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891786 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891804 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:13 crc kubenswrapper[4931]: E1201 15:01:13.891869 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:15.891857052 +0000 UTC m=+22.317730889 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.915295 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.928404 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.946313 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.958107 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:13Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.992688 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62446422-f8d8-45d1-81ef-4228b06c21eb-host\") pod \"node-ca-k8x6d\" (UID: \"62446422-f8d8-45d1-81ef-4228b06c21eb\") " pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.992751 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62446422-f8d8-45d1-81ef-4228b06c21eb-serviceca\") pod \"node-ca-k8x6d\" (UID: \"62446422-f8d8-45d1-81ef-4228b06c21eb\") " pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.992822 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgq6\" (UniqueName: \"kubernetes.io/projected/62446422-f8d8-45d1-81ef-4228b06c21eb-kube-api-access-hhgq6\") pod \"node-ca-k8x6d\" (UID: \"62446422-f8d8-45d1-81ef-4228b06c21eb\") " pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.992853 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/62446422-f8d8-45d1-81ef-4228b06c21eb-host\") pod \"node-ca-k8x6d\" (UID: \"62446422-f8d8-45d1-81ef-4228b06c21eb\") " pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:13 crc kubenswrapper[4931]: I1201 15:01:13.993975 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/62446422-f8d8-45d1-81ef-4228b06c21eb-serviceca\") pod \"node-ca-k8x6d\" (UID: \"62446422-f8d8-45d1-81ef-4228b06c21eb\") " pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.037485 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgq6\" (UniqueName: \"kubernetes.io/projected/62446422-f8d8-45d1-81ef-4228b06c21eb-kube-api-access-hhgq6\") pod \"node-ca-k8x6d\" (UID: \"62446422-f8d8-45d1-81ef-4228b06c21eb\") " pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.105733 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-k8x6d" Dec 01 15:01:14 crc kubenswrapper[4931]: W1201 15:01:14.116700 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62446422_f8d8_45d1_81ef_4228b06c21eb.slice/crio-a72252bd8a81be3d79838d664955e410dbbf54df84803c46441d415a7ef70723 WatchSource:0}: Error finding container a72252bd8a81be3d79838d664955e410dbbf54df84803c46441d415a7ef70723: Status 404 returned error can't find the container with id a72252bd8a81be3d79838d664955e410dbbf54df84803c46441d415a7ef70723 Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.200349 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6nwqj"] Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.200773 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.207199 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.207260 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.207287 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2qrqd"] Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.207442 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.207488 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.207639 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-crxtx"] Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.207646 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.207735 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2qrqd" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.208179 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-nfb8b"] Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.208473 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.208723 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.213138 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.213338 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.213374 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.213406 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.213411 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.217767 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.218026 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.218174 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.218245 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.218376 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.226081 
4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.240949 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:14 crc kubenswrapper[4931]: E1201 15:01:14.241083 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.241161 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:14 crc kubenswrapper[4931]: E1201 15:01:14.241212 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.242904 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.246831 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.247546 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.248765 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.249436 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.250401 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 
15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.250938 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.251559 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.252527 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.253170 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.254089 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.254348 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.254593 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.255618 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.256146 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.256701 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.257684 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.258194 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.259098 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.259502 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.260048 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.261039 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.261479 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.262463 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.262885 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.263889 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.264283 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.264900 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.266082 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.266558 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.267090 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.267529 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.268029 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.268886 4931 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.268985 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 
15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.270781 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.271636 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.272035 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.273542 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.274181 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.275138 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.277962 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.278667 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 
15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.279575 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.287143 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.288486 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.290321 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.291109 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.292136 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.292722 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.293773 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.295304 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.295826 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.296684 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.297176 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.301156 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.303651 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.304969 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.305806 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.325181 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.355404 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.361291 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k8x6d" event={"ID":"62446422-f8d8-45d1-81ef-4228b06c21eb","Type":"ContainerStarted","Data":"9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549"} Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.361339 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-k8x6d" event={"ID":"62446422-f8d8-45d1-81ef-4228b06c21eb","Type":"ContainerStarted","Data":"a72252bd8a81be3d79838d664955e410dbbf54df84803c46441d415a7ef70723"} Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.369083 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.382028 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396474 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-cnibin\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396519 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04108827-fec1-408b-8fba-feaa1175ed4f-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396543 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-socket-dir-parent\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 
15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396560 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-var-lib-kubelet\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396579 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f29024b3-c46f-4ef0-8baa-89705f2171f3-hosts-file\") pod \"node-resolver-2qrqd\" (UID: \"f29024b3-c46f-4ef0-8baa-89705f2171f3\") " pod="openshift-dns/node-resolver-2qrqd" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396596 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkzq4\" (UniqueName: \"kubernetes.io/projected/daf46d9f-9b61-4808-ab42-392965da3a7e-kube-api-access-fkzq4\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396615 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-system-cni-dir\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396642 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd4tj\" (UniqueName: \"kubernetes.io/projected/f29024b3-c46f-4ef0-8baa-89705f2171f3-kube-api-access-jd4tj\") pod \"node-resolver-2qrqd\" (UID: 
\"f29024b3-c46f-4ef0-8baa-89705f2171f3\") " pod="openshift-dns/node-resolver-2qrqd" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396717 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-run-k8s-cni-cncf-io\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396762 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh4ff\" (UniqueName: \"kubernetes.io/projected/db092a9c-f0f2-401d-82dd-b3af535585cc-kube-api-access-hh4ff\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396818 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-run-multus-certs\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396837 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/daf46d9f-9b61-4808-ab42-392965da3a7e-mcd-auth-proxy-config\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396883 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-system-cni-dir\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396914 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-os-release\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396932 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-run-netns\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396952 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-hostroot\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.396988 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-etc-kubernetes\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397024 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-cni-dir\") pod \"multus-6nwqj\" 
(UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397042 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db092a9c-f0f2-401d-82dd-b3af535585cc-cni-binary-copy\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397059 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-var-lib-cni-multus\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397092 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgdsm\" (UniqueName: \"kubernetes.io/projected/04108827-fec1-408b-8fba-feaa1175ed4f-kube-api-access-zgdsm\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397111 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-cnibin\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397127 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-daemon-config\") pod \"multus-6nwqj\" (UID: 
\"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397145 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397165 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-var-lib-cni-bin\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397244 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-conf-dir\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397319 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04108827-fec1-408b-8fba-feaa1175ed4f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397403 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/daf46d9f-9b61-4808-ab42-392965da3a7e-rootfs\") pod 
\"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397429 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/daf46d9f-9b61-4808-ab42-392965da3a7e-proxy-tls\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.397449 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-os-release\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.405862 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.416653 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.427909 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.440578 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.452341 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.462219 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.476951 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.494562 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.497939 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/daf46d9f-9b61-4808-ab42-392965da3a7e-mcd-auth-proxy-config\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.497989 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-run-multus-certs\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498013 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-system-cni-dir\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498050 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-run-netns\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498075 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-hostroot\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" 
Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498099 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-etc-kubernetes\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498119 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-system-cni-dir\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498135 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-os-release\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498146 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-run-netns\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498182 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-hostroot\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498160 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-cni-dir\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498165 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-etc-kubernetes\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498222 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-cni-dir\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498245 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db092a9c-f0f2-401d-82dd-b3af535585cc-cni-binary-copy\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498274 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-var-lib-cni-multus\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498317 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgdsm\" (UniqueName: \"kubernetes.io/projected/04108827-fec1-408b-8fba-feaa1175ed4f-kube-api-access-zgdsm\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: 
\"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498338 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-cnibin\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498350 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-var-lib-cni-multus\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498359 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-daemon-config\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498415 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498472 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-var-lib-cni-bin\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 
15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498502 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-os-release\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498502 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-conf-dir\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498534 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-conf-dir\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498542 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04108827-fec1-408b-8fba-feaa1175ed4f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498589 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/daf46d9f-9b61-4808-ab42-392965da3a7e-proxy-tls\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498610 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-os-release\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498639 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/daf46d9f-9b61-4808-ab42-392965da3a7e-rootfs\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498669 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-cnibin\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498690 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04108827-fec1-408b-8fba-feaa1175ed4f-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498726 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-socket-dir-parent\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498743 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-var-lib-kubelet\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498760 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f29024b3-c46f-4ef0-8baa-89705f2171f3-hosts-file\") pod \"node-resolver-2qrqd\" (UID: \"f29024b3-c46f-4ef0-8baa-89705f2171f3\") " pod="openshift-dns/node-resolver-2qrqd" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498779 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkzq4\" (UniqueName: \"kubernetes.io/projected/daf46d9f-9b61-4808-ab42-392965da3a7e-kube-api-access-fkzq4\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498796 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-system-cni-dir\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498815 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-run-k8s-cni-cncf-io\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498831 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-hh4ff\" (UniqueName: \"kubernetes.io/projected/db092a9c-f0f2-401d-82dd-b3af535585cc-kube-api-access-hh4ff\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.498864 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd4tj\" (UniqueName: \"kubernetes.io/projected/f29024b3-c46f-4ef0-8baa-89705f2171f3-kube-api-access-jd4tj\") pod \"node-resolver-2qrqd\" (UID: \"f29024b3-c46f-4ef0-8baa-89705f2171f3\") " pod="openshift-dns/node-resolver-2qrqd" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499015 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499159 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-daemon-config\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499528 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/daf46d9f-9b61-4808-ab42-392965da3a7e-mcd-auth-proxy-config\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499543 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-var-lib-cni-bin\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499580 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-multus-socket-dir-parent\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499607 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/daf46d9f-9b61-4808-ab42-392965da3a7e-rootfs\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499638 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-var-lib-kubelet\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499696 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f29024b3-c46f-4ef0-8baa-89705f2171f3-hosts-file\") pod \"node-resolver-2qrqd\" (UID: \"f29024b3-c46f-4ef0-8baa-89705f2171f3\") " pod="openshift-dns/node-resolver-2qrqd" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499713 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-system-cni-dir\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: 
\"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499731 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-cnibin\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499745 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-run-k8s-cni-cncf-io\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499805 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-cnibin\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.499537 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04108827-fec1-408b-8fba-feaa1175ed4f-os-release\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.500135 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db092a9c-f0f2-401d-82dd-b3af535585cc-host-run-multus-certs\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc 
kubenswrapper[4931]: I1201 15:01:14.500149 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db092a9c-f0f2-401d-82dd-b3af535585cc-cni-binary-copy\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.500292 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04108827-fec1-408b-8fba-feaa1175ed4f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.500372 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04108827-fec1-408b-8fba-feaa1175ed4f-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.504403 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/daf46d9f-9b61-4808-ab42-392965da3a7e-proxy-tls\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.519059 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.524011 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd4tj\" (UniqueName: \"kubernetes.io/projected/f29024b3-c46f-4ef0-8baa-89705f2171f3-kube-api-access-jd4tj\") pod \"node-resolver-2qrqd\" (UID: \"f29024b3-c46f-4ef0-8baa-89705f2171f3\") " pod="openshift-dns/node-resolver-2qrqd" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.528517 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgdsm\" (UniqueName: \"kubernetes.io/projected/04108827-fec1-408b-8fba-feaa1175ed4f-kube-api-access-zgdsm\") pod \"multus-additional-cni-plugins-nfb8b\" (UID: \"04108827-fec1-408b-8fba-feaa1175ed4f\") " pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.530646 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2qrqd" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.532994 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkzq4\" (UniqueName: \"kubernetes.io/projected/daf46d9f-9b61-4808-ab42-392965da3a7e-kube-api-access-fkzq4\") pod \"machine-config-daemon-crxtx\" (UID: \"daf46d9f-9b61-4808-ab42-392965da3a7e\") " pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.532999 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh4ff\" (UniqueName: \"kubernetes.io/projected/db092a9c-f0f2-401d-82dd-b3af535585cc-kube-api-access-hh4ff\") pod \"multus-6nwqj\" (UID: \"db092a9c-f0f2-401d-82dd-b3af535585cc\") " pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.539219 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.540194 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.545497 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.556595 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.597917 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.602303 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v5g28"] Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.603138 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.608538 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.608907 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.609076 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.609442 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.609624 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.609737 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.623513 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.629583 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.651699 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.675838 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.700692 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705133 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-env-overrides\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705175 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705204 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-etc-openvswitch\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705219 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-openvswitch\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705234 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-node-log\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705249 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-ovn-kubernetes\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705264 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-config\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705280 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8b56b\" (UniqueName: \"kubernetes.io/projected/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-kube-api-access-8b56b\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705294 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-bin\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705307 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-script-lib\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705341 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-log-socket\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705364 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-systemd-units\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705396 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-netd\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705413 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovn-node-metrics-cert\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705432 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-netns\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705450 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-ovn\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705519 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-systemd\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705636 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-slash\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705689 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-var-lib-openvswitch\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.705734 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-kubelet\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.737582 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.766787 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.782228 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.802453 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806587 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-systemd-units\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806625 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-netd\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806643 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovn-node-metrics-cert\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806661 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-netns\") pod \"ovnkube-node-v5g28\" (UID: 
\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806690 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-ovn\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806765 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-slash\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806765 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-netd\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806785 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-systemd\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806804 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-systemd-units\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc 
kubenswrapper[4931]: I1201 15:01:14.806806 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-kubelet\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806847 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-slash\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806874 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-systemd\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806858 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-var-lib-openvswitch\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806952 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-env-overrides\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806966 4931 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-var-lib-openvswitch\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806981 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-etc-openvswitch\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806794 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-netns\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807001 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807028 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-etc-openvswitch\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806849 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-ovn\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807050 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-openvswitch\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807071 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-node-log\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807075 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.806852 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-kubelet\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807089 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-ovn-kubernetes\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807113 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-ovn-kubernetes\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807119 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-config\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807137 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b56b\" (UniqueName: \"kubernetes.io/projected/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-kube-api-access-8b56b\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807142 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-node-log\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807154 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-bin\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807175 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-script-lib\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807195 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-log-socket\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807243 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-log-socket\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807253 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-openvswitch\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807271 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-bin\") pod \"ovnkube-node-v5g28\" (UID: 
\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807642 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-env-overrides\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.807724 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-config\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.808128 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-script-lib\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.811692 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovn-node-metrics-cert\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.823497 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6nwqj" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.825331 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b56b\" (UniqueName: \"kubernetes.io/projected/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-kube-api-access-8b56b\") pod \"ovnkube-node-v5g28\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.825281 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: W1201 15:01:14.841753 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb092a9c_f0f2_401d_82dd_b3af535585cc.slice/crio-fc1e1a7ebeaa55827649885d2744d15812529dd0808e97a695e22b7c3c651f7f WatchSource:0}: Error finding container fc1e1a7ebeaa55827649885d2744d15812529dd0808e97a695e22b7c3c651f7f: Status 404 returned error can't find the container with id fc1e1a7ebeaa55827649885d2744d15812529dd0808e97a695e22b7c3c651f7f Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 
15:01:14.845289 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.865014 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.878510 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.900980 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.917928 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.928356 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.931618 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: W1201 15:01:14.939724 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16e4fd4a_b253_4b2f_8f42_ddbfc4dd8f5a.slice/crio-2801ef2d3595c72cae9f7b23b6de9dcce0d98feecb7e8081c177bc10479a51c8 WatchSource:0}: Error finding container 2801ef2d3595c72cae9f7b23b6de9dcce0d98feecb7e8081c177bc10479a51c8: Status 404 returned error can't find the container with id 2801ef2d3595c72cae9f7b23b6de9dcce0d98feecb7e8081c177bc10479a51c8 Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.948542 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.966719 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.981120 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:14 crc kubenswrapper[4931]: I1201 15:01:14.992867 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.012553 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.030322 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.036021 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.044348 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.052131 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.054043 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.057440 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.070644 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.091236 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5
eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.110346 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.124003 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.139067 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.159437 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.181107 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.194442 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.208534 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.227317 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.239679 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.241060 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.241236 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.253623 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.269867 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.285795 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.297165 4931 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.299714 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.299743 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.299751 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.299867 4931 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.304199 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.336729 4931 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.336999 4931 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.338093 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.338243 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.338324 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.338433 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 
15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.338509 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.358844 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.365469 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.365592 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.365674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.365759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.365842 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.366953 4931 generic.go:334] "Generic (PLEG): container finished" podID="04108827-fec1-408b-8fba-feaa1175ed4f" containerID="a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056" exitCode=0 Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.367034 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" event={"ID":"04108827-fec1-408b-8fba-feaa1175ed4f","Type":"ContainerDied","Data":"a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.367070 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" event={"ID":"04108827-fec1-408b-8fba-feaa1175ed4f","Type":"ContainerStarted","Data":"b36ed1b3abfc5b3b71530a6c50966ce3f74ac08d39c6a1480514b69660904ba0"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.370285 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.370359 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.370377 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"de557bc1b6ff5b81b416f4ea4d1f3edda7c782691eaa7d0c829f730531f3f523"} Dec 01 15:01:15 crc 
kubenswrapper[4931]: I1201 15:01:15.377962 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2qrqd" event={"ID":"f29024b3-c46f-4ef0-8baa-89705f2171f3","Type":"ContainerStarted","Data":"20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.378034 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2qrqd" event={"ID":"f29024b3-c46f-4ef0-8baa-89705f2171f3","Type":"ContainerStarted","Data":"84f769fa751534607b6793999e095537343c294fab5de81d229032d008289cb2"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.380823 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549"} Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.382356 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.383481 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nwqj" event={"ID":"db092a9c-f0f2-401d-82dd-b3af535585cc","Type":"ContainerStarted","Data":"a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.383520 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nwqj" event={"ID":"db092a9c-f0f2-401d-82dd-b3af535585cc","Type":"ContainerStarted","Data":"fc1e1a7ebeaa55827649885d2744d15812529dd0808e97a695e22b7c3c651f7f"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.385908 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191" exitCode=0 Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.386164 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.386192 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"2801ef2d3595c72cae9f7b23b6de9dcce0d98feecb7e8081c177bc10479a51c8"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.387240 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.387258 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.387267 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.387280 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.387290 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.389067 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.404143 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbf
ff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3e
e8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\
"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.407290 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.407416 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.407530 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.407668 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.407799 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.418769 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.423020 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.424882 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.424946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.424958 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.425001 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.425016 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.435671 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.435890 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.438187 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.438230 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.438246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.438266 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.438281 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.462321 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.512747 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5
eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.541947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.541999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.542011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.542032 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.542046 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.547450 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.584015 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.630496 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.644954 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.645149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.645161 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.645176 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.645186 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.670739 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.706079 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.747406 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.748519 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.748556 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.748566 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.748581 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.748591 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.787666 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.824892 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.850761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.850810 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.850826 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.850852 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.850868 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.871057 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.909936 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.920761 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.920877 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 
15:01:15.920908 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.920936 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.920958 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921038 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921096 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:19.921081049 +0000 UTC m=+26.346954706 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921111 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921117 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921263 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921282 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921117 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921330 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921337 4931 projected.go:194] Error preparing data for 
projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921377 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:19.921202723 +0000 UTC m=+26.347076390 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921412 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:19.921406559 +0000 UTC m=+26.347280226 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921431 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:19.921425609 +0000 UTC m=+26.347299276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:15 crc kubenswrapper[4931]: E1201 15:01:15.921482 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:01:19.921474331 +0000 UTC m=+26.347347998 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.953531 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.953575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.953584 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.953601 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.953613 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:15Z","lastTransitionTime":"2025-12-01T15:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.974713 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:15 crc kubenswrapper[4931]: I1201 15:01:15.997692 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:15Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.026300 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.056194 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.056224 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.056232 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.056251 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.056261 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.067586 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.111372 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.145013 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.158642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.158689 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.158701 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 
15:01:16.158716 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.158726 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.187105 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.224841 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.240694 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.240707 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:16 crc kubenswrapper[4931]: E1201 15:01:16.240841 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:16 crc kubenswrapper[4931]: E1201 15:01:16.240914 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.261567 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.261606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.261618 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.261636 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.261649 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.267396 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.308535 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.345433 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.364846 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.364893 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.364907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.364927 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.364942 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.390871 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.393983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.394031 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.394048 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.394064 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" 
event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.394080 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.394094 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.396724 4931 generic.go:334] "Generic (PLEG): container finished" podID="04108827-fec1-408b-8fba-feaa1175ed4f" containerID="cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78" exitCode=0 Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.396805 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" event={"ID":"04108827-fec1-408b-8fba-feaa1175ed4f","Type":"ContainerDied","Data":"cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.429564 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.463792 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.467950 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.467996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.468011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.468029 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.468045 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.502955 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.506226 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.507479 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.526716 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.565851 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.571578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.571628 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.571642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.571661 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.571676 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.614997 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.646999 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.673671 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc 
kubenswrapper[4931]: I1201 15:01:16.673717 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.673727 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.673745 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.673756 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.683854 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb0
82ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.723625 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.768137 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.776401 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.776438 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.776450 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.776474 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.776487 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.805763 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.842173 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.881125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.881177 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.881189 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.881208 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.881221 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.887778 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.931608 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.967441 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:16Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.984021 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.984074 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.984091 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 
15:01:16.984116 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:16 crc kubenswrapper[4931]: I1201 15:01:16.984134 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:16Z","lastTransitionTime":"2025-12-01T15:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.007620 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.045608 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.084771 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.086617 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.086659 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.086671 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.086691 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.086701 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:17Z","lastTransitionTime":"2025-12-01T15:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.129969 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.166370 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.190144 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.190218 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.190237 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.190263 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.190283 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:17Z","lastTransitionTime":"2025-12-01T15:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.211643 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.240641 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:17 crc kubenswrapper[4931]: E1201 15:01:17.240832 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.247341 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.283904 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.293323 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.293361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.293371 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.293402 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.293413 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:17Z","lastTransitionTime":"2025-12-01T15:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.337462 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.366102 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.396275 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.396324 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.396340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.396360 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.396375 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:17Z","lastTransitionTime":"2025-12-01T15:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.402006 4931 generic.go:334] "Generic (PLEG): container finished" podID="04108827-fec1-408b-8fba-feaa1175ed4f" containerID="fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1" exitCode=0 Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.402091 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" event={"ID":"04108827-fec1-408b-8fba-feaa1175ed4f","Type":"ContainerDied","Data":"fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.414881 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: E1201 15:01:17.421582 4931 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.471208 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"im
age\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.499870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.499915 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.499928 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.499944 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.499957 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:17Z","lastTransitionTime":"2025-12-01T15:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.522335 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.552463 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e61
9c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.584788 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.602091 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.602121 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.602130 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.602143 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.602155 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:17Z","lastTransitionTime":"2025-12-01T15:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.623902 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.663327 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.704788 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.707100 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.707143 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.707155 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.707171 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.707186 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:17Z","lastTransitionTime":"2025-12-01T15:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.753064 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 
15:01:17.784530 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c51
6d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver
-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\
\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.810269 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.810308 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.810316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.810332 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.810342 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:17Z","lastTransitionTime":"2025-12-01T15:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.822633 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.864916 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.908629 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.912627 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.912678 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.912693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.912714 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.912727 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:17Z","lastTransitionTime":"2025-12-01T15:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.949215 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:17 crc kubenswrapper[4931]: I1201 15:01:17.985419 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.015055 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.015104 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.015113 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.015129 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.015138 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.021994 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":
\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.064060 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.117704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.117756 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.117770 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.117790 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.117807 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.119513 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.146703 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.190308 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.220687 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.220728 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.220740 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.220757 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.220770 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.226066 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.240872 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.240996 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:18 crc kubenswrapper[4931]: E1201 15:01:18.241149 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:18 crc kubenswrapper[4931]: E1201 15:01:18.241267 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.269246 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.323415 4931 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.323458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.323467 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.323483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.323494 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.408794 4931 generic.go:334] "Generic (PLEG): container finished" podID="04108827-fec1-408b-8fba-feaa1175ed4f" containerID="311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef" exitCode=0 Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.408858 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" event={"ID":"04108827-fec1-408b-8fba-feaa1175ed4f","Type":"ContainerDied","Data":"311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.424239 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.426137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.426182 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.426194 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.426210 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.426219 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.448228 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.472924 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.496036 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.509528 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.530524 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.530565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.530575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.530590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.530600 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.531100 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.544426 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.584632 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.625970 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.633403 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.633463 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.633476 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.633500 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.633516 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.667910 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.703521 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.736126 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.736175 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.736191 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.736216 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.736231 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.742718 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.791996 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.825546 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70
e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.839789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.839818 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.839827 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.839841 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.839850 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.866782 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:18Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.942991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.943025 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.943034 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.943048 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:18 crc kubenswrapper[4931]: I1201 15:01:18.943057 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:18Z","lastTransitionTime":"2025-12-01T15:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.045424 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.045474 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.045486 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.045503 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.045515 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.147405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.147449 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.147458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.147473 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.147484 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.241319 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.241475 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.250205 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.250240 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.250249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.250268 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.250278 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.351978 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.352017 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.352027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.352044 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.352054 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.415750 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.418071 4931 generic.go:334] "Generic (PLEG): container finished" podID="04108827-fec1-408b-8fba-feaa1175ed4f" containerID="f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad" exitCode=0 Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.418113 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" event={"ID":"04108827-fec1-408b-8fba-feaa1175ed4f","Type":"ContainerDied","Data":"f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.439463 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.452346 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.455232 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.455260 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.455268 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.455283 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.455293 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.472953 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.487541 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.501047 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.513305 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.524244 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.536130 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.547922 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.557506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.557545 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.557556 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.557573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.557584 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.560542 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.574354 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.583957 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.601871 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.613095 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.628048 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:19Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.660529 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.660573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.660586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.660605 4931 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.660616 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.763400 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.763446 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.763458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.763476 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.763491 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.866416 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.866481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.866505 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.866535 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.866555 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.958899 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.958972 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.958997 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.959023 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.959048 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959159 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959210 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:27.959196916 +0000 UTC m=+34.385070583 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959693 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959730 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959747 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959789 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959816 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:27.959788604 +0000 UTC m=+34.385662271 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959885 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959929 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:01:27.959906317 +0000 UTC m=+34.385780024 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959934 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959970 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.959999 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:27.95998464 +0000 UTC m=+34.385858347 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:19 crc kubenswrapper[4931]: E1201 15:01:19.960053 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:27.960018661 +0000 UTC m=+34.385892388 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.970595 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.970645 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.970669 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.970700 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:19 crc kubenswrapper[4931]: I1201 15:01:19.970725 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:19Z","lastTransitionTime":"2025-12-01T15:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.073979 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.074044 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.074069 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.074101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.074124 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:20Z","lastTransitionTime":"2025-12-01T15:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.176454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.176499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.176511 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.176530 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.176541 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:20Z","lastTransitionTime":"2025-12-01T15:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.241040 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.241139 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:20 crc kubenswrapper[4931]: E1201 15:01:20.241259 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:20 crc kubenswrapper[4931]: E1201 15:01:20.241557 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.280275 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.280304 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.280313 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.280326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.280336 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:20Z","lastTransitionTime":"2025-12-01T15:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.383721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.383774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.383792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.383816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.383836 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:20Z","lastTransitionTime":"2025-12-01T15:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.426417 4931 generic.go:334] "Generic (PLEG): container finished" podID="04108827-fec1-408b-8fba-feaa1175ed4f" containerID="1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893" exitCode=0 Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.426485 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" event={"ID":"04108827-fec1-408b-8fba-feaa1175ed4f","Type":"ContainerDied","Data":"1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.453868 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.477080 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.490181 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.490263 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.490287 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.490320 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.490345 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:20Z","lastTransitionTime":"2025-12-01T15:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.505874 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.521226 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.534700 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.547520 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.561420 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.576691 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.593250 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.594102 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.594219 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.594337 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.594501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.594664 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:20Z","lastTransitionTime":"2025-12-01T15:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.613162 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.623616 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.636836 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.664370 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.677420 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.691330 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f6
8e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:20Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.696834 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.696862 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.696871 4931 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.696884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.696893 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:20Z","lastTransitionTime":"2025-12-01T15:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.799296 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.799351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.799369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.799433 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.799452 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:20Z","lastTransitionTime":"2025-12-01T15:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.902119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.902167 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.902177 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.902192 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:20 crc kubenswrapper[4931]: I1201 15:01:20.902203 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:20Z","lastTransitionTime":"2025-12-01T15:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.005331 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.005432 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.005454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.005483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.005500 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.109102 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.109186 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.109206 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.109230 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.109248 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.212330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.212417 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.212436 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.212461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.212478 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.240436 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:21 crc kubenswrapper[4931]: E1201 15:01:21.240627 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.316364 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.316441 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.316461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.316488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.316507 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.419073 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.419117 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.419134 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.419151 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.419164 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.435713 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.435991 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.436029 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.436039 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.440053 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" event={"ID":"04108827-fec1-408b-8fba-feaa1175ed4f","Type":"ContainerStarted","Data":"191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.449951 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.464275 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.466109 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.466448 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.477295 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.494803 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.506867 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.518628 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.521721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.521761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.521773 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.521792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.521805 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.532225 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.550545 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.571293 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.587506 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.601205 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.620217 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9d
a410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.623748 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.623794 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.623803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.623820 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.623830 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.637961 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z 
is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.653764 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.683164 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.696748 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.714215 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.726729 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.726772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.726781 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.726801 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.726812 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.730911 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.768055 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.791493 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.807780 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.829461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.829516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.829533 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.829556 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.829570 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.840733 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.855137 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
1T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.870843 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31198
8a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.885207 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.896844 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.914175 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.931609 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.932032 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.932068 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.932082 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.932102 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.932114 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:21Z","lastTransitionTime":"2025-12-01T15:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.946862 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:21 crc kubenswrapper[4931]: I1201 15:01:21.960355 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:21Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.035174 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 
15:01:22.035236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.035254 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.035285 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.035302 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.139631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.139699 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.139718 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.139751 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.139770 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.241093 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.241171 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:22 crc kubenswrapper[4931]: E1201 15:01:22.241267 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:22 crc kubenswrapper[4931]: E1201 15:01:22.241469 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.243767 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.243843 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.243871 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.243908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.243933 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.346986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.347052 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.347072 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.347102 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.347122 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.449285 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.449344 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.449360 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.449402 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.449416 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.553012 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.553084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.553104 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.553133 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.553153 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.657149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.657238 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.657263 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.657297 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.657318 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.761687 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.761770 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.761788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.761815 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.761841 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.864772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.864831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.864848 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.864869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.864884 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.968492 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.968552 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.968568 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.968587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:22 crc kubenswrapper[4931]: I1201 15:01:22.968599 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:22Z","lastTransitionTime":"2025-12-01T15:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.072777 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.072850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.072868 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.072895 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.072914 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:23Z","lastTransitionTime":"2025-12-01T15:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.185643 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.185701 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.185721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.185746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.185765 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:23Z","lastTransitionTime":"2025-12-01T15:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.240763 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:23 crc kubenswrapper[4931]: E1201 15:01:23.240925 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.288813 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.288862 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.288872 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.288891 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.288901 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:23Z","lastTransitionTime":"2025-12-01T15:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.391467 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.391509 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.391519 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.391538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.391548 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:23Z","lastTransitionTime":"2025-12-01T15:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.494673 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.494749 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.494772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.494797 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.494814 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:23Z","lastTransitionTime":"2025-12-01T15:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.597586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.597807 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.597818 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.597835 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.597846 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:23Z","lastTransitionTime":"2025-12-01T15:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.701040 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.701094 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.701109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.701130 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.701145 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:23Z","lastTransitionTime":"2025-12-01T15:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.803534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.803594 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.803607 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.803623 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.803633 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:23Z","lastTransitionTime":"2025-12-01T15:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.905600 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.905637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.905649 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.905666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:23 crc kubenswrapper[4931]: I1201 15:01:23.905678 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:23Z","lastTransitionTime":"2025-12-01T15:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.009329 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.009399 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.009413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.009432 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.009445 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.112861 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.112908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.112920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.112938 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.112951 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.215969 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.216039 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.216064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.216095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.216122 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.241332 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.241477 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:24 crc kubenswrapper[4931]: E1201 15:01:24.241764 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:24 crc kubenswrapper[4931]: E1201 15:01:24.241854 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.264428 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.281098 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31198
8a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.297518 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.313703 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.318166 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.318204 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.318218 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.318235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.318248 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.337335 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.351213 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.366693 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.379173 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.396846 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.416483 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.421188 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.421241 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.421254 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.421273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.421286 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.436879 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.450252 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.452006 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/0.log" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.456944 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.456894 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf" exitCode=1 Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.458542 4931 scope.go:117] "RemoveContainer" 
containerID="f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.468953 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.485965 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.517987 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.524057 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.524108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.524126 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.524149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.524163 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.534574 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.551338 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.565894 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.577429 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.606121 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:23Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 15:01:23.724456 6248 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:23.724476 6248 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI1201 15:01:23.724499 6248 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:23.724509 6248 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 15:01:23.724525 6248 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 15:01:23.724537 6248 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 15:01:23.724549 6248 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 15:01:23.724554 6248 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 15:01:23.724598 6248 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 15:01:23.724599 6248 factory.go:656] Stopping watch factory\\\\nI1201 15:01:23.724608 6248 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:23.724609 6248 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 15:01:23.724614 6248 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 15:01:23.724630 6248 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 15:01:23.724637 6248 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.620848 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.628737 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.628814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.628834 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 
15:01:24.628859 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.628875 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.638962 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.657353 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.675002 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.695662 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.710299 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.731887 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.731953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.731972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.732006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.732026 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.739655 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.761877 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.789199 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.807029 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.835433 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.835479 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.835494 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.835516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.835532 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.937816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.937869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.937882 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.937902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:24 crc kubenswrapper[4931]: I1201 15:01:24.937915 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:24Z","lastTransitionTime":"2025-12-01T15:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.040083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.040146 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.040158 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.040178 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.040191 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.142604 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.142659 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.142680 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.142704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.142720 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.240759 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:25 crc kubenswrapper[4931]: E1201 15:01:25.240897 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.245475 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.245513 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.245525 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.245542 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.245553 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.347928 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.347976 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.347987 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.348006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.348018 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.450497 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.450534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.450543 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.450558 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.450568 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.465607 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/0.log" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.469951 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.470420 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.483427 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae16
2b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.493467 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.519080 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.535967 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.553316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.553370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.553399 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.553419 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.553433 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.554908 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.571450 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.585188 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.601014 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.618341 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.640788 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.652143 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.656111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.656175 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.656199 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.656231 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.656252 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.666549 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.692656 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:23Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 15:01:23.724456 6248 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:23.724476 6248 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 15:01:23.724499 6248 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:23.724509 6248 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 15:01:23.724525 6248 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 15:01:23.724537 6248 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 15:01:23.724549 6248 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 15:01:23.724554 6248 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 15:01:23.724598 6248 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 15:01:23.724599 6248 factory.go:656] Stopping watch factory\\\\nI1201 15:01:23.724608 6248 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:23.724609 6248 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 15:01:23.724614 6248 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 15:01:23.724630 6248 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 15:01:23.724637 6248 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.709487 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.728864 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.759226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.759267 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.759278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.759299 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.759310 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.815732 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.815783 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.815799 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.815817 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.815832 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: E1201 15:01:25.837061 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.841849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.841917 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.841939 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.841967 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.841990 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: E1201 15:01:25.864007 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.868954 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.869024 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.869049 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.869084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.869109 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: E1201 15:01:25.884146 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.888103 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.888171 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.888194 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.888221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.888244 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: E1201 15:01:25.905650 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.910937 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.910986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.911003 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.911030 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.911046 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:25 crc kubenswrapper[4931]: E1201 15:01:25.930474 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:25 crc kubenswrapper[4931]: E1201 15:01:25.930708 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.934070 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.934165 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.934188 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.934221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:25 crc kubenswrapper[4931]: I1201 15:01:25.934254 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:25Z","lastTransitionTime":"2025-12-01T15:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.038241 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.038318 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.038337 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.038364 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.038382 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.142075 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.142139 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.142188 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.142215 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.142235 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.240865 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.240903 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:26 crc kubenswrapper[4931]: E1201 15:01:26.241069 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:26 crc kubenswrapper[4931]: E1201 15:01:26.241267 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.246350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.246508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.246532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.246573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.246617 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.350105 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.350186 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.350207 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.350234 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.350253 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.454081 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.454137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.454149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.454170 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.454185 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.476683 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/1.log" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.477700 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/0.log" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.482133 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193" exitCode=1 Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.482192 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.482255 4931 scope.go:117] "RemoveContainer" containerID="f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.484179 4931 scope.go:117] "RemoveContainer" containerID="4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193" Dec 01 15:01:26 crc kubenswrapper[4931]: E1201 15:01:26.484523 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.551569 4931 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20e
a12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b62
0cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.556326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.556368 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.556404 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.556423 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.556436 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.569133 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28
d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.588630 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.604521 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.623573 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.641050 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.660239 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.660664 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.660704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.660721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.660746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.660763 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.675974 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.693174 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.704643 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.722011 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:23Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 15:01:23.724456 6248 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:23.724476 6248 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 15:01:23.724499 6248 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:23.724509 6248 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 15:01:23.724525 6248 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 15:01:23.724537 6248 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 15:01:23.724549 6248 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 15:01:23.724554 6248 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 15:01:23.724598 6248 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 15:01:23.724599 6248 factory.go:656] Stopping watch factory\\\\nI1201 15:01:23.724608 6248 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:23.724609 6248 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 15:01:23.724614 6248 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 15:01:23.724630 6248 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 15:01:23.724637 6248 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"ed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z]\\\\nI1201 15:01:25.433518 6382 services_controller.go:452] Built service openshift-console/console per-node LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433495 6382 
model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:01:25.433527 6382 services_controller.go:453] Built service openshift-console/console template LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433537 6382 services_controller.go:454] S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/
cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.735072 4931 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.760629 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2da
ed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.763980 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.764035 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.771588 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.771621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.771632 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.782951 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.797966 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:26Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.874333 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.874370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.874379 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.874416 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.874428 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.977126 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.977206 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.977226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.977255 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:26 crc kubenswrapper[4931]: I1201 15:01:26.977276 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:26Z","lastTransitionTime":"2025-12-01T15:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.081128 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.081201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.081218 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.081249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.081270 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:27Z","lastTransitionTime":"2025-12-01T15:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.185178 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.185279 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.185309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.185356 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.185456 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:27Z","lastTransitionTime":"2025-12-01T15:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.240526 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.240749 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.289278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.289417 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.289437 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.289462 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.289491 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:27Z","lastTransitionTime":"2025-12-01T15:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.356960 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp"] Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.357976 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.362063 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.362943 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.384020 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.392277 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.392329 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.392342 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.392364 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.392403 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:27Z","lastTransitionTime":"2025-12-01T15:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.406855 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.427119 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.443524 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.458160 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqthv\" (UniqueName: \"kubernetes.io/projected/e1a8fdf6-a549-4875-9712-bab1069cfd7d-kube-api-access-nqthv\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.458263 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1a8fdf6-a549-4875-9712-bab1069cfd7d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.458312 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1a8fdf6-a549-4875-9712-bab1069cfd7d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.458355 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1a8fdf6-a549-4875-9712-bab1069cfd7d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.458906 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.484356 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4c617579ec2978a3019775fbf8f550aea2c98dec7e877416c3ffabc4c8dd6cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:23Z\\\",\\\"message\\\":\\\"d (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 
15:01:23.724456 6248 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:23.724476 6248 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 15:01:23.724499 6248 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:23.724509 6248 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 15:01:23.724525 6248 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1201 15:01:23.724537 6248 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 15:01:23.724549 6248 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 15:01:23.724554 6248 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 15:01:23.724598 6248 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1201 15:01:23.724599 6248 factory.go:656] Stopping watch factory\\\\nI1201 15:01:23.724608 6248 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:23.724609 6248 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 15:01:23.724614 6248 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 15:01:23.724630 6248 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 15:01:23.724637 6248 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"ed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is after 2025-08-24T17:21:41Z]\\\\nI1201 15:01:25.433518 6382 services_controller.go:452] Built service openshift-console/console per-node LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433495 6382 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:01:25.433527 6382 services_controller.go:453] Built service openshift-console/console template LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433537 6382 services_controller.go:454] 
S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.489603 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/1.log" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.494613 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.494680 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.494698 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.494789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.494868 4931 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:27Z","lastTransitionTime":"2025-12-01T15:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.495538 4931 scope.go:117] "RemoveContainer" containerID="4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193" Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.495816 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.510742 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.533930 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31198
8a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.549162 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.559472 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1a8fdf6-a549-4875-9712-bab1069cfd7d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.559535 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1a8fdf6-a549-4875-9712-bab1069cfd7d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.559594 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1a8fdf6-a549-4875-9712-bab1069cfd7d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.559717 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqthv\" (UniqueName: \"kubernetes.io/projected/e1a8fdf6-a549-4875-9712-bab1069cfd7d-kube-api-access-nqthv\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.560534 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1a8fdf6-a549-4875-9712-bab1069cfd7d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.560839 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1a8fdf6-a549-4875-9712-bab1069cfd7d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.564732 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.572720 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1a8fdf6-a549-4875-9712-bab1069cfd7d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.578202 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqthv\" (UniqueName: \"kubernetes.io/projected/e1a8fdf6-a549-4875-9712-bab1069cfd7d-kube-api-access-nqthv\") pod \"ovnkube-control-plane-749d76644c-t4vqp\" (UID: \"e1a8fdf6-a549-4875-9712-bab1069cfd7d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.591869 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.598228 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.598285 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.598296 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.598310 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.598319 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:27Z","lastTransitionTime":"2025-12-01T15:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.619531 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.637236 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.653951 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.669713 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.682065 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.685764 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" Dec 01 15:01:27 crc kubenswrapper[4931]: W1201 15:01:27.701437 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1a8fdf6_a549_4875_9712_bab1069cfd7d.slice/crio-d5be9c33c627513eb66e91e27db43fcb8c5fa6d6a008eee446bf8848e8f29a42 WatchSource:0}: Error finding container d5be9c33c627513eb66e91e27db43fcb8c5fa6d6a008eee446bf8848e8f29a42: Status 404 returned error can't find the container with id d5be9c33c627513eb66e91e27db43fcb8c5fa6d6a008eee446bf8848e8f29a42 Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.701541 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.701590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.701602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.701617 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.701627 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:27Z","lastTransitionTime":"2025-12-01T15:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.709439 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.728203 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.741113 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.754620 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.764716 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.775865 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.789199 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.805244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.805295 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.805308 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.805326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.805341 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:27Z","lastTransitionTime":"2025-12-01T15:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.815493 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.830371 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.841473 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.851747 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.872575 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"ed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is 
after 2025-08-24T17:21:41Z]\\\\nI1201 15:01:25.433518 6382 services_controller.go:452] Built service openshift-console/console per-node LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433495 6382 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:01:25.433527 6382 services_controller.go:453] Built service openshift-console/console template LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433537 6382 services_controller.go:454] S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.886453 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.900020 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.907314 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.907339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.907347 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.907362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.907371 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:27Z","lastTransitionTime":"2025-12-01T15:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.914236 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.947639 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.962748 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.962873 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.962900 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.962923 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:27 crc kubenswrapper[4931]: I1201 15:01:27.962950 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963086 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963140 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:43.963126433 +0000 UTC m=+50.389000090 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963503 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:01:43.963480833 +0000 UTC m=+50.389354500 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963575 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963586 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963596 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963620 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:43.963614067 +0000 UTC m=+50.389487734 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963660 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963670 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963676 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963696 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-01 15:01:43.963688459 +0000 UTC m=+50.389562126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963722 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:27 crc kubenswrapper[4931]: E1201 15:01:27.963741 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:43.963735871 +0000 UTC m=+50.389609538 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.009957 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.010009 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.010020 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.010036 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.010047 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.102716 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-78dk9"] Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.103470 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:28 crc kubenswrapper[4931]: E1201 15:01:28.103546 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.112846 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.112903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.112921 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.112946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.112963 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.123235 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.142654 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-a
dditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df3
12ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.161987 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.164691 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.164764 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7tmh\" (UniqueName: \"kubernetes.io/projected/2e105961-27de-4865-bd7b-44dd04d12034-kube-api-access-v7tmh\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.179020 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.196201 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.213529 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.215310 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 
15:01:28.215405 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.215428 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.215459 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.215478 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.225863 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.239205 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc 
kubenswrapper[4931]: I1201 15:01:28.241705 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.241766 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:28 crc kubenswrapper[4931]: E1201 15:01:28.241857 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:28 crc kubenswrapper[4931]: E1201 15:01:28.241965 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.266039 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.266129 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7tmh\" (UniqueName: \"kubernetes.io/projected/2e105961-27de-4865-bd7b-44dd04d12034-kube-api-access-v7tmh\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:28 crc kubenswrapper[4931]: E1201 15:01:28.266359 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:28 crc kubenswrapper[4931]: E1201 15:01:28.266481 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs podName:2e105961-27de-4865-bd7b-44dd04d12034 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:28.766456251 +0000 UTC m=+35.192329928 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs") pod "network-metrics-daemon-78dk9" (UID: "2e105961-27de-4865-bd7b-44dd04d12034") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.266488 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc07
5cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.278925 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.287092 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7tmh\" (UniqueName: \"kubernetes.io/projected/2e105961-27de-4865-bd7b-44dd04d12034-kube-api-access-v7tmh\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.288334 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.298486 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.316021 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"ed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is 
after 2025-08-24T17:21:41Z]\\\\nI1201 15:01:25.433518 6382 services_controller.go:452] Built service openshift-console/console per-node LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433495 6382 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:01:25.433527 6382 services_controller.go:453] Built service openshift-console/console template LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433537 6382 services_controller.go:454] S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.318869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.318920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.318931 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.318947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.318958 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.327947 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.340450 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.354746 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.367942 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.421573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.421613 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.421622 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.421638 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.421647 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.499615 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" event={"ID":"e1a8fdf6-a549-4875-9712-bab1069cfd7d","Type":"ContainerStarted","Data":"640292ba3e42fe63b35af563d05da7a5df12a4277ab98d12494bc30ec75966dd"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.499671 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" event={"ID":"e1a8fdf6-a549-4875-9712-bab1069cfd7d","Type":"ContainerStarted","Data":"025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.499681 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" event={"ID":"e1a8fdf6-a549-4875-9712-bab1069cfd7d","Type":"ContainerStarted","Data":"d5be9c33c627513eb66e91e27db43fcb8c5fa6d6a008eee446bf8848e8f29a42"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.511521 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb0
82ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.521086 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.523542 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.523593 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.523605 4931 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.523625 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.523638 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.531868 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 
01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.550563 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\
\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.565029 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.579896 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.589952 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.607490 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"ed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is 
after 2025-08-24T17:21:41Z]\\\\nI1201 15:01:25.433518 6382 services_controller.go:452] Built service openshift-console/console per-node LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433495 6382 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:01:25.433527 6382 services_controller.go:453] Built service openshift-console/console template LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433537 6382 services_controller.go:454] S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.618417 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.626396 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.626442 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.626452 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 
15:01:28.626466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.626478 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.632026 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.643090 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.655405 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.666170 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.678585 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.700918 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31198
8a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.721532 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.729412 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.729482 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.729501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.729529 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.729547 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.740857 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}
}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.770481 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:28 crc kubenswrapper[4931]: E1201 15:01:28.770704 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:28 crc kubenswrapper[4931]: E1201 15:01:28.770833 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs podName:2e105961-27de-4865-bd7b-44dd04d12034 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:29.770804512 +0000 UTC m=+36.196678209 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs") pod "network-metrics-daemon-78dk9" (UID: "2e105961-27de-4865-bd7b-44dd04d12034") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.832965 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.833014 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.833029 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.833048 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.833060 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.936522 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.936586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.936606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.936632 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:28 crc kubenswrapper[4931]: I1201 15:01:28.936650 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:28Z","lastTransitionTime":"2025-12-01T15:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.040786 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.040855 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.040873 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.040901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.040922 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.144750 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.144814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.144832 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.144855 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.144873 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.241337 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:29 crc kubenswrapper[4931]: E1201 15:01:29.241599 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.247796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.247870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.247890 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.247916 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.247937 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.350608 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.350674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.350692 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.350716 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.350735 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.454073 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.454143 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.454163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.454196 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.454215 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.557792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.558242 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.558260 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.558285 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.558307 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.661580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.661647 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.661666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.661690 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.661708 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.764889 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.764950 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.764963 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.764982 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.764995 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.784646 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:29 crc kubenswrapper[4931]: E1201 15:01:29.784902 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:29 crc kubenswrapper[4931]: E1201 15:01:29.785004 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs podName:2e105961-27de-4865-bd7b-44dd04d12034 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:31.784979227 +0000 UTC m=+38.210852954 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs") pod "network-metrics-daemon-78dk9" (UID: "2e105961-27de-4865-bd7b-44dd04d12034") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.868146 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.868546 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.868719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.868868 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.869029 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.972362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.972442 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.972453 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.972472 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:29 crc kubenswrapper[4931]: I1201 15:01:29.972484 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:29Z","lastTransitionTime":"2025-12-01T15:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.075301 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.075330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.075339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.075352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.075361 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:30Z","lastTransitionTime":"2025-12-01T15:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.178176 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.178235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.178249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.178267 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.178279 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:30Z","lastTransitionTime":"2025-12-01T15:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.241362 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.241505 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.241606 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:30 crc kubenswrapper[4931]: E1201 15:01:30.241749 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:30 crc kubenswrapper[4931]: E1201 15:01:30.241896 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:30 crc kubenswrapper[4931]: E1201 15:01:30.241978 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.281100 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.281321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.281421 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.281498 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.282161 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:30Z","lastTransitionTime":"2025-12-01T15:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.385775 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.386453 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.386634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.386780 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.387087 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:30Z","lastTransitionTime":"2025-12-01T15:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.490572 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.490641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.490660 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.490690 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.490710 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:30Z","lastTransitionTime":"2025-12-01T15:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.594345 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.594415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.594424 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.594441 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.594451 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:30Z","lastTransitionTime":"2025-12-01T15:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.697459 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.697516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.697528 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.697545 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.697564 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:30Z","lastTransitionTime":"2025-12-01T15:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.800668 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.800750 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.800768 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.800803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.800823 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:30Z","lastTransitionTime":"2025-12-01T15:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.904582 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.904634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.904650 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.904676 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:30 crc kubenswrapper[4931]: I1201 15:01:30.904694 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:30Z","lastTransitionTime":"2025-12-01T15:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.008351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.008507 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.008532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.008564 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.008589 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.112610 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.112675 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.112699 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.112776 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.112828 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.215590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.215656 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.215673 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.215699 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.215717 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.240901 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:31 crc kubenswrapper[4931]: E1201 15:01:31.241073 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.318669 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.318721 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.318732 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.318751 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.318764 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.421159 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.421218 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.421230 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.421256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.421268 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.523820 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.523881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.523893 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.523913 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.523926 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.626731 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.626784 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.626796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.626815 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.626829 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.729259 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.729320 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.729337 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.729361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.729415 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.808452 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:31 crc kubenswrapper[4931]: E1201 15:01:31.808604 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:31 crc kubenswrapper[4931]: E1201 15:01:31.808656 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs podName:2e105961-27de-4865-bd7b-44dd04d12034 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:35.808642229 +0000 UTC m=+42.234515896 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs") pod "network-metrics-daemon-78dk9" (UID: "2e105961-27de-4865-bd7b-44dd04d12034") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.832057 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.832115 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.832133 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.832154 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.832169 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.935200 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.935263 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.935282 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.935306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:31 crc kubenswrapper[4931]: I1201 15:01:31.935322 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:31Z","lastTransitionTime":"2025-12-01T15:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.037345 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.037407 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.037416 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.037431 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.037440 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.140791 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.140860 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.140880 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.140904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.140922 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.241151 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.241197 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:32 crc kubenswrapper[4931]: E1201 15:01:32.241372 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.242028 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:32 crc kubenswrapper[4931]: E1201 15:01:32.242280 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:32 crc kubenswrapper[4931]: E1201 15:01:32.241972 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.243662 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.243706 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.243723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.243747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.243765 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.346159 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.346212 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.346228 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.346251 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.346269 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.449463 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.449529 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.449551 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.449579 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.449600 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.552600 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.552665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.552685 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.552710 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.552917 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.655859 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.655897 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.655907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.655921 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.655929 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.759080 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.759153 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.759172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.759201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.759228 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.863586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.864119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.864192 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.864258 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.864342 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.968063 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.968523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.968734 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.968968 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:32 crc kubenswrapper[4931]: I1201 15:01:32.969171 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:32Z","lastTransitionTime":"2025-12-01T15:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.072235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.072297 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.072309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.072328 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.072339 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:33Z","lastTransitionTime":"2025-12-01T15:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.175060 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.175101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.175109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.175123 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.175132 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:33Z","lastTransitionTime":"2025-12-01T15:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.241068 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:33 crc kubenswrapper[4931]: E1201 15:01:33.241213 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.280587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.280668 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.280710 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.280748 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.280775 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:33Z","lastTransitionTime":"2025-12-01T15:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.383705 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.383800 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.383820 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.383852 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.383874 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:33Z","lastTransitionTime":"2025-12-01T15:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.487084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.487158 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.487177 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.487206 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.487228 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:33Z","lastTransitionTime":"2025-12-01T15:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.591081 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.591161 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.591187 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.591221 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.591249 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:33Z","lastTransitionTime":"2025-12-01T15:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.695262 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.695340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.695435 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.695476 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.695499 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:33Z","lastTransitionTime":"2025-12-01T15:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.798115 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.798169 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.798186 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.798212 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.798233 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:33Z","lastTransitionTime":"2025-12-01T15:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.902743 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.902883 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.902907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.902986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:33 crc kubenswrapper[4931]: I1201 15:01:33.903009 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:33Z","lastTransitionTime":"2025-12-01T15:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.006028 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.006116 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.006140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.006178 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.006198 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.119503 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.119590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.119619 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.119651 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.119676 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.223096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.223168 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.223188 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.223217 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.223237 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.240730 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.240746 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:34 crc kubenswrapper[4931]: E1201 15:01:34.240944 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.241009 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:34 crc kubenswrapper[4931]: E1201 15:01:34.241064 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:34 crc kubenswrapper[4931]: E1201 15:01:34.241294 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.259518 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.274774 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.295542 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.314132 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc 
kubenswrapper[4931]: I1201 15:01:34.326097 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.326152 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.326169 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.326191 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.326206 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.354962 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.374022 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.390869 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.408473 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.428636 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.428682 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.428694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.428712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.428752 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.443107 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"ed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is 
after 2025-08-24T17:21:41Z]\\\\nI1201 15:01:25.433518 6382 services_controller.go:452] Built service openshift-console/console per-node LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433495 6382 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:01:25.433527 6382 services_controller.go:453] Built service openshift-console/console template LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433537 6382 services_controller.go:454] S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.462742 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.479933 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.495902 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.512151 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.530964 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.533380 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.533536 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.533634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.533726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.533814 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.549560 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.569228 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.583363 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.637562 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.637634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.637660 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.637699 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.637723 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.741097 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.741135 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.741147 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.741164 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.741175 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.843956 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.844004 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.844015 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.844030 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.844043 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.947913 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.947964 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.947975 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.947997 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:34 crc kubenswrapper[4931]: I1201 15:01:34.948013 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:34Z","lastTransitionTime":"2025-12-01T15:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.050694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.050772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.050794 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.050822 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.050841 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.154452 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.154535 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.154555 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.154584 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.154604 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.240820 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:35 crc kubenswrapper[4931]: E1201 15:01:35.241084 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.258281 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.258340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.258355 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.258378 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.258410 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.361283 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.361331 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.361345 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.361365 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.361380 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.465245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.465296 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.465313 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.465335 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.465352 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.569614 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.569695 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.569719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.569754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.569778 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.675759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.676683 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.676741 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.676774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.676799 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.780447 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.780487 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.780499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.780516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.780528 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.856641 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:35 crc kubenswrapper[4931]: E1201 15:01:35.856813 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:35 crc kubenswrapper[4931]: E1201 15:01:35.856864 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs podName:2e105961-27de-4865-bd7b-44dd04d12034 nodeName:}" failed. No retries permitted until 2025-12-01 15:01:43.856849628 +0000 UTC m=+50.282723295 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs") pod "network-metrics-daemon-78dk9" (UID: "2e105961-27de-4865-bd7b-44dd04d12034") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.883330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.883428 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.883446 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.883468 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.883492 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.986638 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.986705 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.986722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.986745 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:35 crc kubenswrapper[4931]: I1201 15:01:35.986765 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:35Z","lastTransitionTime":"2025-12-01T15:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.089773 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.089816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.089827 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.089845 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.089855 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.192731 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.192794 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.192816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.192839 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.192860 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.240744 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.240827 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:36 crc kubenswrapper[4931]: E1201 15:01:36.240872 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.240905 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:36 crc kubenswrapper[4931]: E1201 15:01:36.240985 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:36 crc kubenswrapper[4931]: E1201 15:01:36.241097 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.277789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.277846 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.277858 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.277878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.277895 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: E1201 15:01:36.296558 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:36Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.311996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.312043 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.312053 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.312076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.312087 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: E1201 15:01:36.326146 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:36Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.331166 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.331228 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.331238 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.331256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.331267 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: E1201 15:01:36.347692 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:36Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.352676 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.352722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.352733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.352755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.352768 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: E1201 15:01:36.366804 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:36Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.370570 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.370635 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.370654 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.370679 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.370700 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: E1201 15:01:36.385702 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:36Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:36 crc kubenswrapper[4931]: E1201 15:01:36.385928 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.387489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.387523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.387535 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.387553 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.387566 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.490785 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.490854 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.490865 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.490882 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.490895 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.594050 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.594084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.594091 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.594108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.594117 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.697243 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.697276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.697286 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.697300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.697310 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.800441 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.800480 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.800489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.800503 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.800512 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.903193 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.903242 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.903253 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.903275 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:36 crc kubenswrapper[4931]: I1201 15:01:36.903289 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:36Z","lastTransitionTime":"2025-12-01T15:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.005871 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.005910 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.005919 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.005934 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.005943 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.108676 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.108728 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.108742 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.108766 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.108783 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.211469 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.211514 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.211523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.211537 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.211546 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.241060 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:37 crc kubenswrapper[4931]: E1201 15:01:37.241268 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.315119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.315200 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.315226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.315256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.315282 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.418589 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.418674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.418702 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.418737 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.418765 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.521537 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.521620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.521638 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.521665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.521682 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.624025 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.624085 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.624102 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.624125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.624141 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.727626 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.727723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.727745 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.727771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.727789 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.830920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.830962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.830973 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.830989 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.831003 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.933475 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.933533 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.933550 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.933573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:37 crc kubenswrapper[4931]: I1201 15:01:37.933591 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:37Z","lastTransitionTime":"2025-12-01T15:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.035620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.035695 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.035717 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.035745 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.035767 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.138865 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.138903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.138912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.138925 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.138934 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.240548 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.240606 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.240577 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:38 crc kubenswrapper[4931]: E1201 15:01:38.240820 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:38 crc kubenswrapper[4931]: E1201 15:01:38.241030 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:38 crc kubenswrapper[4931]: E1201 15:01:38.241252 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.242313 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.242429 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.242456 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.242487 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.242509 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.346558 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.346632 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.346650 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.347041 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.347093 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.450681 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.450765 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.450788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.450814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.450834 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.553871 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.553981 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.554038 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.554067 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.554086 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.657272 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.657330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.657343 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.657366 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.657401 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.760172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.760248 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.760259 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.760278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.760291 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.862843 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.862879 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.862888 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.862903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.862914 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.965947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.965997 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.966006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.966021 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:38 crc kubenswrapper[4931]: I1201 15:01:38.966031 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:38Z","lastTransitionTime":"2025-12-01T15:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.068537 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.068585 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.068597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.068614 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.068627 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.171369 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.171443 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.171459 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.171482 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.171496 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.240603 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:39 crc kubenswrapper[4931]: E1201 15:01:39.240786 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.274244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.274311 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.274330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.274359 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.274378 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.376654 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.376743 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.376761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.376788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.376808 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.479244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.479306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.479325 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.479349 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.479368 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.582577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.582629 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.582642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.582663 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.582680 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.686101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.686172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.686192 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.686219 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.686242 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.789461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.789542 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.789561 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.789590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.789609 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.892646 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.892748 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.892774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.892807 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.892831 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.996349 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.996485 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.996508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.996573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:39 crc kubenswrapper[4931]: I1201 15:01:39.996593 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:39Z","lastTransitionTime":"2025-12-01T15:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.099735 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.099802 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.099823 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.099854 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.099880 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:40Z","lastTransitionTime":"2025-12-01T15:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.202941 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.203012 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.203024 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.203050 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.203064 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:40Z","lastTransitionTime":"2025-12-01T15:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.241524 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.241575 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.241537 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:40 crc kubenswrapper[4931]: E1201 15:01:40.241778 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:40 crc kubenswrapper[4931]: E1201 15:01:40.241940 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:40 crc kubenswrapper[4931]: E1201 15:01:40.242057 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.307210 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.307282 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.307304 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.307332 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.307351 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:40Z","lastTransitionTime":"2025-12-01T15:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.410865 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.410945 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.410968 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.410994 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.411019 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:40Z","lastTransitionTime":"2025-12-01T15:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.515341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.515478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.515498 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.515580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.515609 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:40Z","lastTransitionTime":"2025-12-01T15:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.619841 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.619914 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.619933 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.619962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.619982 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:40Z","lastTransitionTime":"2025-12-01T15:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.723993 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.724070 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.724089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.724120 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.724141 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:40Z","lastTransitionTime":"2025-12-01T15:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.827876 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.827953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.827972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.828004 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.828024 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:40Z","lastTransitionTime":"2025-12-01T15:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.931784 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.931857 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.931873 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.931900 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:40 crc kubenswrapper[4931]: I1201 15:01:40.931920 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:40Z","lastTransitionTime":"2025-12-01T15:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.034723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.034801 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.034825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.034850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.034868 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.137827 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.137902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.137921 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.137959 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.137980 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.240471 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:41 crc kubenswrapper[4931]: E1201 15:01:41.240681 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.241637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.241689 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.241703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.241722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.241734 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.244849 4931 scope.go:117] "RemoveContainer" containerID="4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.345442 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.345819 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.345843 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.345874 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.345894 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.395130 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.406680 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.411631 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.430426 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d
903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.448697 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.449050 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.449095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.449106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.449121 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.449130 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.464191 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.481481 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.498740 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.515296 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.528282 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc 
kubenswrapper[4931]: I1201 15:01:41.549803 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/1.log" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.551917 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.551945 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.551953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.551988 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.551997 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.551894 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.560195 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.560680 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.564816 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.577572 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.589423 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.614280 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"ed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is 
after 2025-08-24T17:21:41Z]\\\\nI1201 15:01:25.433518 6382 services_controller.go:452] Built service openshift-console/console per-node LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433495 6382 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:01:25.433527 6382 services_controller.go:453] Built service openshift-console/console template LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433537 6382 services_controller.go:454] S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.634164 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.651464 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.654494 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.654517 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.654526 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.654540 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.654550 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.672627 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.689057 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.724721 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.744290 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.756836 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.756890 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.756903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.756921 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.756936 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.758462 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc 
kubenswrapper[4931]: I1201 15:01:41.771925 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.796608 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.810871 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:
55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.827091 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.841890 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.852676 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.858883 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.858940 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.858957 4931 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.858978 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.858990 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.870891 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.887446 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.904073 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.923802 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.937666 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.953313 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:41Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.961722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.961775 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.961784 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.961803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:41 crc kubenswrapper[4931]: I1201 15:01:41.961814 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:41Z","lastTransitionTime":"2025-12-01T15:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.005272 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"ed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is 
after 2025-08-24T17:21:41Z]\\\\nI1201 15:01:25.433518 6382 services_controller.go:452] Built service openshift-console/console per-node LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433495 6382 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:01:25.433527 6382 services_controller.go:453] Built service openshift-console/console template LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433537 6382 services_controller.go:454] 
S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.026942 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.047639 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.064704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.064761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.064781 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.064805 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.064824 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:42Z","lastTransitionTime":"2025-12-01T15:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.167512 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.167574 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.167590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.167616 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.167630 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:42Z","lastTransitionTime":"2025-12-01T15:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.241455 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.241496 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.241588 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:42 crc kubenswrapper[4931]: E1201 15:01:42.242109 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:42 crc kubenswrapper[4931]: E1201 15:01:42.242174 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:42 crc kubenswrapper[4931]: E1201 15:01:42.241879 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.270883 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.270951 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.270968 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.270992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.271012 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:42Z","lastTransitionTime":"2025-12-01T15:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.374172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.374254 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.374274 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.374301 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.374320 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:42Z","lastTransitionTime":"2025-12-01T15:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.477368 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.477453 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.477471 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.477499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.477518 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:42Z","lastTransitionTime":"2025-12-01T15:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.566593 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/2.log" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.567748 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/1.log" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.571376 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9" exitCode=1 Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.571434 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.571480 4931 scope.go:117] "RemoveContainer" containerID="4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.574587 4931 scope.go:117] "RemoveContainer" containerID="b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9" Dec 01 15:01:42 crc kubenswrapper[4931]: E1201 15:01:42.575016 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.581762 4931 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.581787 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.581795 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.581807 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.581817 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:42Z","lastTransitionTime":"2025-12-01T15:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.589696 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.605551 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.629691 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.650955 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:
55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.672431 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.684542 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:42 crc 
kubenswrapper[4931]: I1201 15:01:42.684595 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.684619 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.684646 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.684665 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:42Z","lastTransitionTime":"2025-12-01T15:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.695833 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb0
82ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.715036 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.730071 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc 
kubenswrapper[4931]: I1201 15:01:42.746649 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.765991 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.787954 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.795606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.795805 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.795939 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.796123 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.796248 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:42Z","lastTransitionTime":"2025-12-01T15:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.804951 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.822236 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.854347 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4500f4c1af00a3ab31b228d936a06ee04be1b13cc35e0575bb6ae65504849193\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:25Z\\\",\\\"message\\\":\\\"ed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:25Z is 
after 2025-08-24T17:21:41Z]\\\\nI1201 15:01:25.433518 6382 services_controller.go:452] Built service openshift-console/console per-node LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433495 6382 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:01:25.433527 6382 services_controller.go:453] Built service openshift-console/console template LB for network=default: []services.LB{}\\\\nI1201 15:01:25.433537 6382 services_controller.go:454] S\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:42Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 10.993708ms\\\\nI1201 15:01:42.136821 6600 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 15:01:42.136888 6600 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:42.136907 6600 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 15:01:42.136950 6600 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 15:01:42.136961 6600 handler.go:190] 
Sending *v1.Pod event handler 6 for removal\\\\nI1201 15:01:42.136982 6600 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 15:01:42.136984 6600 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:42.137016 6600 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 15:01:42.137009 6600 factory.go:656] Stopping watch factory\\\\nI1201 15:01:42.137027 6600 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:42.137031 6600 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 15:01:42.137436 6600 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 15:01:42.137572 6600 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 15:01:42.137648 6600 ovnkube.go:599] Stopped ovnkube\\\\nI1201 15:01:42.137691 6600 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 15:01:42.137785 6600 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mount
Path\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.879512 4931 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.899646 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.899715 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.899734 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.899762 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.899780 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:42Z","lastTransitionTime":"2025-12-01T15:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.901223 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.922077 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:42 crc kubenswrapper[4931]: I1201 15:01:42.944140 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:42Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.003155 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.003572 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.003746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.003927 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.004073 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.107627 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.108062 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.108323 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.108564 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.108822 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.212403 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.212482 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.212496 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.212518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.212532 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.241312 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.241460 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.316142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.316191 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.316204 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.316224 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.316239 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.419466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.419536 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.419555 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.419581 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.419599 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.522587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.522656 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.522672 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.522694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.522711 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.576886 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/2.log" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.581097 4931 scope.go:117] "RemoveContainer" containerID="b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9" Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.581272 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.598073 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.622957 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.625131 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.625176 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: 
I1201 15:01:43.625187 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.625206 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.625225 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.641362 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.660734 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.676209 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.691229 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.701322 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc 
kubenswrapper[4931]: I1201 15:01:43.715070 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.727932 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.728301 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.728344 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.728356 4931 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.728376 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.728408 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.739363 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.757952 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.770198 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.783118 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.806432 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:42Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 10.993708ms\\\\nI1201 15:01:42.136821 6600 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 15:01:42.136888 6600 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:42.136907 6600 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 
15:01:42.136950 6600 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 15:01:42.136961 6600 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 15:01:42.136982 6600 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 15:01:42.136984 6600 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:42.137016 6600 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 15:01:42.137009 6600 factory.go:656] Stopping watch factory\\\\nI1201 15:01:42.137027 6600 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:42.137031 6600 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 15:01:42.137436 6600 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 15:01:42.137572 6600 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 15:01:42.137648 6600 ovnkube.go:599] Stopped ovnkube\\\\nI1201 15:01:42.137691 6600 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 15:01:42.137785 6600 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.822146 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.830861 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.830922 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.830934 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.830953 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.830982 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.848543 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.865622 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.872260 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.872422 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.872488 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs podName:2e105961-27de-4865-bd7b-44dd04d12034 
nodeName:}" failed. No retries permitted until 2025-12-01 15:01:59.872471952 +0000 UTC m=+66.298345619 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs") pod "network-metrics-daemon-78dk9" (UID: "2e105961-27de-4865-bd7b-44dd04d12034") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.878126 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-al
erter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:43Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.934289 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.934421 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.934436 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.934453 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.934464 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:43Z","lastTransitionTime":"2025-12-01T15:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.973076 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.973218 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973306 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:02:15.973254711 +0000 UTC m=+82.399128418 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973364 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.973442 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973462 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:02:15.973439177 +0000 UTC m=+82.399312944 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973533 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973562 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973582 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.973573 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973636 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-01 15:02:15.973619042 +0000 UTC m=+82.399492809 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:43 crc kubenswrapper[4931]: I1201 15:01:43.973666 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973802 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973810 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973856 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:02:15.973844149 +0000 UTC m=+82.399717826 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973859 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973890 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:43 crc kubenswrapper[4931]: E1201 15:01:43.973970 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 15:02:15.973946962 +0000 UTC m=+82.399820679 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.037656 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.037711 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.037722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.037743 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.037756 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.141188 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.141249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.141264 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.141285 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.141302 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.240983 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.241074 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:44 crc kubenswrapper[4931]: E1201 15:01:44.241205 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.241226 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:44 crc kubenswrapper[4931]: E1201 15:01:44.241417 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:44 crc kubenswrapper[4931]: E1201 15:01:44.241547 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.245651 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.245719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.245737 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.245761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.245778 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.259442 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.275134 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc 
kubenswrapper[4931]: I1201 15:01:44.292826 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.317854 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.342015 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:
55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.347832 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.347884 
4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.347901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.347926 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.347945 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.362361 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.374336 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.401826 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:42Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 10.993708ms\\\\nI1201 15:01:42.136821 6600 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 15:01:42.136888 6600 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:42.136907 6600 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 
15:01:42.136950 6600 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 15:01:42.136961 6600 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 15:01:42.136982 6600 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 15:01:42.136984 6600 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:42.137016 6600 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 15:01:42.137009 6600 factory.go:656] Stopping watch factory\\\\nI1201 15:01:42.137027 6600 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:42.137031 6600 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 15:01:42.137436 6600 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 15:01:42.137572 6600 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 15:01:42.137648 6600 ovnkube.go:599] Stopped ovnkube\\\\nI1201 15:01:42.137691 6600 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 15:01:42.137785 6600 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.417640 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.432113 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.450696 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.451245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.451370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.451732 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.451771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.451789 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.469601 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.481766 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.493338 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.509160 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.526096 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.553222 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.554922 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.554992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.555019 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.555058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.555086 4931 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.568809 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scrip
t\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:44Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.658620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.658686 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.658707 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.658734 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.658752 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.762028 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.762106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.762125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.762153 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.762172 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.867634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.867704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.867723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.867751 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.867770 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.971623 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.971690 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.971714 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.971746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:44 crc kubenswrapper[4931]: I1201 15:01:44.971769 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:44Z","lastTransitionTime":"2025-12-01T15:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.075594 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.075686 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.075712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.075746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.075778 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:45Z","lastTransitionTime":"2025-12-01T15:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.179042 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.179108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.179129 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.179155 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.179173 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:45Z","lastTransitionTime":"2025-12-01T15:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.241073 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:45 crc kubenswrapper[4931]: E1201 15:01:45.241252 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.282683 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.282754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.282775 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.282805 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.282827 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:45Z","lastTransitionTime":"2025-12-01T15:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.385464 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.385516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.385532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.385555 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.385570 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:45Z","lastTransitionTime":"2025-12-01T15:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.488327 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.488415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.488434 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.488461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.488479 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:45Z","lastTransitionTime":"2025-12-01T15:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.590878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.590918 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.591038 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.591366 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.591412 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:45Z","lastTransitionTime":"2025-12-01T15:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.693875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.693929 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.693941 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.693999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.694016 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:45Z","lastTransitionTime":"2025-12-01T15:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.796768 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.796828 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.796846 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.796870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.796888 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:45Z","lastTransitionTime":"2025-12-01T15:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.899959 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.900338 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.900518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.900669 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:45 crc kubenswrapper[4931]: I1201 15:01:45.900799 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:45Z","lastTransitionTime":"2025-12-01T15:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.003191 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.003255 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.003273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.003298 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.003316 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.106878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.106944 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.106959 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.106986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.107004 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.209526 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.209606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.209623 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.209682 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.209702 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.241088 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.241144 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:46 crc kubenswrapper[4931]: E1201 15:01:46.241287 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.241306 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:46 crc kubenswrapper[4931]: E1201 15:01:46.241541 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:46 crc kubenswrapper[4931]: E1201 15:01:46.241746 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.313398 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.313464 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.313483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.313506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.313522 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.416219 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.416272 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.416286 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.416302 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.416313 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.519338 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.519420 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.519451 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.519471 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.519501 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.622095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.622156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.622177 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.622206 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.622225 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.683615 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.683987 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.684086 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.684173 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.684270 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: E1201 15:01:46.706896 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:46Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.712523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.712579 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.712590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.712609 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.712622 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: E1201 15:01:46.725563 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:46Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.730457 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.730499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.730531 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.730555 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.730568 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: E1201 15:01:46.745888 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:46Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.751377 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.751488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.751543 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.751571 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.751590 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: E1201 15:01:46.770642 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:46Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.775916 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.775964 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.775979 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.776002 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.776016 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: E1201 15:01:46.794670 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:46Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:46 crc kubenswrapper[4931]: E1201 15:01:46.794825 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.797010 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.797081 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.797097 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.797113 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.797124 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.902573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.902988 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.903115 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.903316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:46 crc kubenswrapper[4931]: I1201 15:01:46.903513 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:46Z","lastTransitionTime":"2025-12-01T15:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.006810 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.007252 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.007266 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.007292 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.007304 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.110599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.110652 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.110664 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.110683 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.110700 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.214226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.214287 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.214306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.214334 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.214354 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.240643 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:47 crc kubenswrapper[4931]: E1201 15:01:47.240815 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.317255 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.317346 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.317365 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.317422 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.317443 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.421325 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.421454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.421477 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.421511 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.421537 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.524635 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.524709 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.524729 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.524757 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.524777 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.628460 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.628541 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.628559 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.628584 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.628604 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.732200 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.732277 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.732296 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.732322 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.732343 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.836349 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.836424 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.836434 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.836454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.836463 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.940038 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.940108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.940128 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.940156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:47 crc kubenswrapper[4931]: I1201 15:01:47.940174 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:47Z","lastTransitionTime":"2025-12-01T15:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.043879 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.043940 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.043961 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.043986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.044004 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.147602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.147686 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.147709 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.147739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.147805 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.241449 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.241540 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.241487 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:48 crc kubenswrapper[4931]: E1201 15:01:48.241654 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:48 crc kubenswrapper[4931]: E1201 15:01:48.241796 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:48 crc kubenswrapper[4931]: E1201 15:01:48.242047 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.251523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.251573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.251590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.251611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.251631 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.354683 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.354738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.354758 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.354779 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.354797 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.458457 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.458538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.458557 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.458586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.458606 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.563533 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.563602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.563623 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.563650 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.563688 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.667184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.667250 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.667269 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.667334 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.667358 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.770498 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.770546 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.770577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.770597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.770607 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.873851 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.873884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.873892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.873907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.873917 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.976528 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.976580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.976604 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.976636 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:48 crc kubenswrapper[4931]: I1201 15:01:48.976658 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:48Z","lastTransitionTime":"2025-12-01T15:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.079712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.079783 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.079808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.079840 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.079864 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:49Z","lastTransitionTime":"2025-12-01T15:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.182641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.182687 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.182704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.182726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.182742 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:49Z","lastTransitionTime":"2025-12-01T15:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.241039 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:49 crc kubenswrapper[4931]: E1201 15:01:49.241259 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.285884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.285948 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.285971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.286002 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.286023 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:49Z","lastTransitionTime":"2025-12-01T15:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.387638 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.387676 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.387684 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.387698 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.387707 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:49Z","lastTransitionTime":"2025-12-01T15:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.491100 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.491170 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.491254 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.491328 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.491357 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:49Z","lastTransitionTime":"2025-12-01T15:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.601096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.601195 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.601220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.601249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.601269 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:49Z","lastTransitionTime":"2025-12-01T15:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.704206 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.704257 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.704266 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.704284 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.704298 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:49Z","lastTransitionTime":"2025-12-01T15:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.807777 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.807836 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.807852 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.807875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.807893 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:49Z","lastTransitionTime":"2025-12-01T15:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.910525 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.910601 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.910619 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.910645 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:49 crc kubenswrapper[4931]: I1201 15:01:49.910663 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:49Z","lastTransitionTime":"2025-12-01T15:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.013776 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.013875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.013904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.013941 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.013966 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.116273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.116322 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.116340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.116362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.116376 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.219599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.219675 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.219702 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.219733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.219754 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.240666 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.240759 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.240769 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 15:01:50 crc kubenswrapper[4931]: E1201 15:01:50.240894 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034"
Dec 01 15:01:50 crc kubenswrapper[4931]: E1201 15:01:50.241035 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 15:01:50 crc kubenswrapper[4931]: E1201 15:01:50.241271 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.323693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.323774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.323796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.323828 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.323866 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.426152 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.426197 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.426209 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.426227 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.426239 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.530253 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.530329 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.530348 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.530379 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.530426 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.633494 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.633542 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.633560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.633583 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.633601 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.736978 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.737011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.737019 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.737033 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.737044 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.839589 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.839622 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.839631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.839643 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.839651 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.942318 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.942398 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.942413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.942427 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:50 crc kubenswrapper[4931]: I1201 15:01:50.942438 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:50Z","lastTransitionTime":"2025-12-01T15:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.045248 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.045300 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.045372 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.045407 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.045419 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.147651 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.147709 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.147719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.147737 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.147749 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.240956 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 15:01:51 crc kubenswrapper[4931]: E1201 15:01:51.241200 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.250812 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.250842 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.250851 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.250862 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.250872 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.353668 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.353764 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.353776 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.353803 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.353814 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.456488 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.456566 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.456586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.456611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.456630 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.559754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.559825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.559844 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.559870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.559894 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.663198 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.663279 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.663304 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.663343 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.663369 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.766137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.766195 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.766204 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.766220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.766229 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.868471 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.868520 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.868534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.868554 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.868568 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.972004 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.972097 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.972133 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.972171 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:51 crc kubenswrapper[4931]: I1201 15:01:51.972193 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:51Z","lastTransitionTime":"2025-12-01T15:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.075059 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.075147 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.075171 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.075205 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.075228 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:52Z","lastTransitionTime":"2025-12-01T15:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.177590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.177670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.177710 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.177743 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.177765 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:52Z","lastTransitionTime":"2025-12-01T15:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.241282 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.241282 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9"
Dec 01 15:01:52 crc kubenswrapper[4931]: E1201 15:01:52.241431 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 15:01:52 crc kubenswrapper[4931]: E1201 15:01:52.241766 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.241964 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 15:01:52 crc kubenswrapper[4931]: E1201 15:01:52.242130 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.280957 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.281020 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.281037 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.281059 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.281076 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:52Z","lastTransitionTime":"2025-12-01T15:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.384163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.384226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.384244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.384269 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.384288 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:52Z","lastTransitionTime":"2025-12-01T15:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.487014 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.487053 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.487063 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.487079 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.487088 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:52Z","lastTransitionTime":"2025-12-01T15:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.590130 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.590199 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.590219 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.590244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.590262 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:52Z","lastTransitionTime":"2025-12-01T15:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.693740 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.693825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.693839 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.693866 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.693885 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:52Z","lastTransitionTime":"2025-12-01T15:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.797679 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.797726 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.797738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.797755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.797767 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:52Z","lastTransitionTime":"2025-12-01T15:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.900938 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.901013 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.901036 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.901063 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 15:01:52 crc kubenswrapper[4931]: I1201 15:01:52.901081 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:52Z","lastTransitionTime":"2025-12-01T15:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.004350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.004470 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.004502 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.004537 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.004564 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.106829 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.106914 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.106945 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.106980 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.107004 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.210516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.210582 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.210599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.210623 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.210640 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.240598 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:53 crc kubenswrapper[4931]: E1201 15:01:53.240792 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.313908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.313958 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.313971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.313996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.314011 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.416809 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.416880 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.416901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.416927 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.416946 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.519454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.519508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.519518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.519533 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.519543 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.622086 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.622138 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.622148 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.622166 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.622176 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.726063 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.726134 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.726156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.726184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.726203 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.829769 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.829835 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.829853 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.829879 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.829900 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.932665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.932702 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.932716 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.932731 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:53 crc kubenswrapper[4931]: I1201 15:01:53.932745 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:53Z","lastTransitionTime":"2025-12-01T15:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.036885 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.037366 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.037606 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.037810 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.037980 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.141881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.141956 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.141981 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.142015 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.142038 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.240677 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.240700 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.240919 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:54 crc kubenswrapper[4931]: E1201 15:01:54.241129 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:54 crc kubenswrapper[4931]: E1201 15:01:54.241292 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:54 crc kubenswrapper[4931]: E1201 15:01:54.241535 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.246328 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.246427 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.246450 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.246482 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.246505 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.264246 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.286392 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.303018 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.322973 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Comp
leted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.348931 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.348979 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.348991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.349011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.349026 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.365490 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.387860 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.409970 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.428203 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4
a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.446967 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.452642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.452683 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.452699 4931 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.452720 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.452734 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.464158 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 
01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.483334 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.504223 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.525464 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.544006 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.557842 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.558108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.558251 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.558379 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.558573 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.561144 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.578876 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.610741 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:42Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 10.993708ms\\\\nI1201 15:01:42.136821 6600 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 15:01:42.136888 6600 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:42.136907 6600 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 
15:01:42.136950 6600 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 15:01:42.136961 6600 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 15:01:42.136982 6600 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 15:01:42.136984 6600 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:42.137016 6600 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 15:01:42.137009 6600 factory.go:656] Stopping watch factory\\\\nI1201 15:01:42.137027 6600 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:42.137031 6600 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 15:01:42.137436 6600 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 15:01:42.137572 6600 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 15:01:42.137648 6600 ovnkube.go:599] Stopped ovnkube\\\\nI1201 15:01:42.137691 6600 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 15:01:42.137785 6600 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.631663 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:54Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.661770 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.661831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.661857 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 
15:01:54.661896 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.661921 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.765950 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.766078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.766149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.766180 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.766261 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.870560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.870621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.870641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.870671 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.870689 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.974283 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.974374 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.974435 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.974473 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:54 crc kubenswrapper[4931]: I1201 15:01:54.974500 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:54Z","lastTransitionTime":"2025-12-01T15:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.077740 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.077830 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.077855 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.077883 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.077902 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:55Z","lastTransitionTime":"2025-12-01T15:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.180929 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.180983 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.180997 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.181018 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.181033 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:55Z","lastTransitionTime":"2025-12-01T15:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.241163 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:55 crc kubenswrapper[4931]: E1201 15:01:55.241436 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.285342 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.285442 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.285461 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.285486 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.285505 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:55Z","lastTransitionTime":"2025-12-01T15:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.388598 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.388674 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.388695 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.388725 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.388749 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:55Z","lastTransitionTime":"2025-12-01T15:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.491707 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.491771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.491789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.491817 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.491837 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:55Z","lastTransitionTime":"2025-12-01T15:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.595144 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.595188 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.595200 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.595218 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.595230 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:55Z","lastTransitionTime":"2025-12-01T15:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.698822 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.699236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.699437 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.699619 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.699805 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:55Z","lastTransitionTime":"2025-12-01T15:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.802969 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.803005 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.803015 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.803032 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.803044 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:55Z","lastTransitionTime":"2025-12-01T15:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.906909 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.906975 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.906993 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.907020 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:55 crc kubenswrapper[4931]: I1201 15:01:55.907039 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:55Z","lastTransitionTime":"2025-12-01T15:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.010310 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.010426 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.010453 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.010482 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.010500 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.114887 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.115137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.115155 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.115182 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.115200 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.219189 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.219246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.219265 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.219291 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.219311 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.241342 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:56 crc kubenswrapper[4931]: E1201 15:01:56.241631 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.242769 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:56 crc kubenswrapper[4931]: E1201 15:01:56.243083 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.243270 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:56 crc kubenswrapper[4931]: E1201 15:01:56.243428 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.322294 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.322351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.322367 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.322391 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.322429 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.426012 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.426079 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.426096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.426121 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.426141 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.530078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.530140 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.530158 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.530183 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.530203 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.633775 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.633825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.633841 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.633863 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.633879 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.738021 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.738071 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.738090 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.738113 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.738131 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.840720 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.840758 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.840769 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.840785 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.840796 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.945144 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.945226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.945249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.945278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:56 crc kubenswrapper[4931]: I1201 15:01:56.945300 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:56Z","lastTransitionTime":"2025-12-01T15:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.049509 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.049582 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.049600 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.049630 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.049652 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.051690 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.051759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.051784 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.051813 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.051836 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: E1201 15:01:57.074375 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:57Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.081223 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.081303 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.081320 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.081345 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.081421 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: E1201 15:01:57.097667 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:57Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.103002 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.103049 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.103088 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.103108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.103121 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: E1201 15:01:57.121333 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:57Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.128259 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.128311 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.128330 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.128364 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.128423 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: E1201 15:01:57.153660 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:57Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.161127 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.161223 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.161243 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.161270 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.161288 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: E1201 15:01:57.179988 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:01:57Z is after 2025-08-24T17:21:41Z" Dec 01 15:01:57 crc kubenswrapper[4931]: E1201 15:01:57.180203 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.182425 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.182481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.182494 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.182515 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.182528 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.240511 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:57 crc kubenswrapper[4931]: E1201 15:01:57.240730 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.286460 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.286577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.286596 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.286621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.286657 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.389029 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.389076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.389093 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.389115 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.389131 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.492445 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.492874 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.493003 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.493164 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.493300 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.597010 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.597079 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.597092 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.597108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.597121 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.699688 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.699740 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.699754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.699773 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.699786 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.802985 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.803497 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.803508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.803531 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.803543 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.907035 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.907091 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.907112 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.907138 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:57 crc kubenswrapper[4931]: I1201 15:01:57.907157 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:57Z","lastTransitionTime":"2025-12-01T15:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.009362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.009489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.009521 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.009560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.009584 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.112274 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.112332 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.112349 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.112374 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.112418 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.215340 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.215727 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.215801 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.215875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.215961 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.241233 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.241317 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.241355 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:01:58 crc kubenswrapper[4931]: E1201 15:01:58.241596 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:01:58 crc kubenswrapper[4931]: E1201 15:01:58.241714 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:01:58 crc kubenswrapper[4931]: E1201 15:01:58.241918 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.243177 4931 scope.go:117] "RemoveContainer" containerID="b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9" Dec 01 15:01:58 crc kubenswrapper[4931]: E1201 15:01:58.244145 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.318886 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.318943 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.318956 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.318976 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.318988 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.422104 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.422173 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.422192 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.422223 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.422246 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.525430 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.525501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.525520 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.525547 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.525570 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.628283 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.628325 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.628335 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.628351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.628362 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.760423 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.761493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.761506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.761521 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.761532 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.863791 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.864070 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.864135 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.864197 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.864253 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.967156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.967208 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.967220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.967245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:58 crc kubenswrapper[4931]: I1201 15:01:58.967258 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:58Z","lastTransitionTime":"2025-12-01T15:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.069789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.069847 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.069858 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.069877 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.069891 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:59Z","lastTransitionTime":"2025-12-01T15:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.173172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.173371 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.173541 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.173709 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.173847 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:59Z","lastTransitionTime":"2025-12-01T15:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.240828 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:01:59 crc kubenswrapper[4931]: E1201 15:01:59.241065 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.277770 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.277819 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.277837 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.277860 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.277880 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:59Z","lastTransitionTime":"2025-12-01T15:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.381838 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.382319 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.382598 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.382884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.383135 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:59Z","lastTransitionTime":"2025-12-01T15:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.486847 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.487238 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.487459 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.487594 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.487769 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:59Z","lastTransitionTime":"2025-12-01T15:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.591284 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.591362 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.591381 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.591449 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.591479 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:59Z","lastTransitionTime":"2025-12-01T15:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.695436 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.695495 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.695507 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.695528 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.695543 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:59Z","lastTransitionTime":"2025-12-01T15:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.799202 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.799283 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.799303 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.799332 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.799352 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:59Z","lastTransitionTime":"2025-12-01T15:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.901937 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.901993 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.902005 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.902024 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.902040 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:01:59Z","lastTransitionTime":"2025-12-01T15:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:01:59 crc kubenswrapper[4931]: I1201 15:01:59.944709 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:01:59 crc kubenswrapper[4931]: E1201 15:01:59.944980 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:01:59 crc kubenswrapper[4931]: E1201 15:01:59.945103 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs podName:2e105961-27de-4865-bd7b-44dd04d12034 nodeName:}" failed. No retries permitted until 2025-12-01 15:02:31.945073466 +0000 UTC m=+98.370947163 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs") pod "network-metrics-daemon-78dk9" (UID: "2e105961-27de-4865-bd7b-44dd04d12034") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.004627 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.004694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.004712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.004747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.004770 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.108786 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.108845 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.108862 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.108892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.108912 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.218241 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.219308 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.219713 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.219933 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.220088 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.241103 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.241165 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.241102 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:00 crc kubenswrapper[4931]: E1201 15:02:00.241356 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:00 crc kubenswrapper[4931]: E1201 15:02:00.241607 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:00 crc kubenswrapper[4931]: E1201 15:02:00.241758 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.324063 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.324129 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.324150 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.324176 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.324196 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.427450 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.427492 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.427502 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.427517 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.427527 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.530693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.530748 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.530759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.530774 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.530784 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.632988 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.633036 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.633047 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.633064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.633076 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.734804 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.734847 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.734858 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.734876 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.734890 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.836976 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.837089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.837105 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.837122 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.837131 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.939465 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.939499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.939512 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.939527 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:00 crc kubenswrapper[4931]: I1201 15:02:00.939541 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:00Z","lastTransitionTime":"2025-12-01T15:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.041874 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.041918 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.041931 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.041949 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.041962 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.144246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.144297 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.144310 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.144324 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.144336 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.241447 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:01 crc kubenswrapper[4931]: E1201 15:02:01.241601 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.247499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.247727 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.247758 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.247772 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.247781 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.349757 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.349790 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.349825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.349847 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.349858 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.452685 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.452736 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.452794 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.452825 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.452839 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.555826 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.555862 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.555870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.555885 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.555894 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.646642 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/0.log" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.646705 4931 generic.go:334] "Generic (PLEG): container finished" podID="db092a9c-f0f2-401d-82dd-b3af535585cc" containerID="a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8" exitCode=1 Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.646740 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nwqj" event={"ID":"db092a9c-f0f2-401d-82dd-b3af535585cc","Type":"ContainerDied","Data":"a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.647185 4931 scope.go:117] "RemoveContainer" containerID="a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.658288 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.658338 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.658349 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.658366 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.658376 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.666788 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.679616 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.692805 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.706179 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.724379 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:42Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 10.993708ms\\\\nI1201 15:01:42.136821 6600 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 15:01:42.136888 6600 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:42.136907 6600 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 
15:01:42.136950 6600 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 15:01:42.136961 6600 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 15:01:42.136982 6600 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 15:01:42.136984 6600 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:42.137016 6600 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 15:01:42.137009 6600 factory.go:656] Stopping watch factory\\\\nI1201 15:01:42.137027 6600 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:42.137031 6600 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 15:01:42.137436 6600 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 15:01:42.137572 6600 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 15:01:42.137648 6600 ovnkube.go:599] Stopped ovnkube\\\\nI1201 15:01:42.137691 6600 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 15:01:42.137785 6600 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.737637 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.751142 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.760785 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.760830 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.760842 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.760859 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.760872 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.764851 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.780049 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.794037 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.807427 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.830071 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e259
6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.849237 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.863778 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.863811 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.863821 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.863837 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.863847 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.867740 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"2025-12-01T15:01:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda\\\\n2025-12-01T15:01:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda to /host/opt/cni/bin/\\\\n2025-12-01T15:01:16Z [verbose] multus-daemon started\\\\n2025-12-01T15:01:16Z [verbose] Readiness Indicator file check\\\\n2025-12-01T15:02:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.879704 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.890336 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.900659 4931 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc 
kubenswrapper[4931]: I1201 15:02:01.913963 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:01Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.965987 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.966023 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.966034 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.966067 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:01 crc kubenswrapper[4931]: I1201 15:02:01.966076 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:01Z","lastTransitionTime":"2025-12-01T15:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.068412 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.068548 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.068733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.068904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.069067 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.172053 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.172103 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.172115 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.172134 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.172144 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.240587 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.240637 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:02 crc kubenswrapper[4931]: E1201 15:02:02.240683 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.240587 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:02 crc kubenswrapper[4931]: E1201 15:02:02.240823 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:02 crc kubenswrapper[4931]: E1201 15:02:02.240863 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.275009 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.275052 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.275064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.275079 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.275091 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.378312 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.378379 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.378449 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.378480 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.378506 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.481612 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.481652 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.481664 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.481678 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.481690 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.585003 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.585084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.585108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.585141 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.585196 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.651706 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/0.log" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.651800 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nwqj" event={"ID":"db092a9c-f0f2-401d-82dd-b3af535585cc","Type":"ContainerStarted","Data":"056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.674234 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"2025-12-01T15:01:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda\\\\n2025-12-01T15:01:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda to /host/opt/cni/bin/\\\\n2025-12-01T15:01:16Z [verbose] multus-daemon started\\\\n2025-12-01T15:01:16Z [verbose] Readiness Indicator file check\\\\n2025-12-01T15:02:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\
",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.687231 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.687278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.687291 4931 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.687306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.687317 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.689470 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.703001 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.713575 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc 
kubenswrapper[4931]: I1201 15:02:02.729078 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.762444 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.781207 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:
55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.789823 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.789859 
4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.789870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.789886 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.789896 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.795957 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.809566 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.835714 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:42Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 10.993708ms\\\\nI1201 15:01:42.136821 6600 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 15:01:42.136888 6600 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:42.136907 6600 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 
15:01:42.136950 6600 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 15:01:42.136961 6600 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 15:01:42.136982 6600 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 15:01:42.136984 6600 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:42.137016 6600 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 15:01:42.137009 6600 factory.go:656] Stopping watch factory\\\\nI1201 15:01:42.137027 6600 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:42.137031 6600 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 15:01:42.137436 6600 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 15:01:42.137572 6600 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 15:01:42.137648 6600 ovnkube.go:599] Stopped ovnkube\\\\nI1201 15:01:42.137691 6600 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 15:01:42.137785 6600 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.849502 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.865128 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.878237 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.892683 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.892731 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.892742 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.892758 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.892770 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.896669 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.908961 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.925542 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.941853 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.955503 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T15:02:02Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.995055 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.995100 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.995114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.995136 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:02 crc kubenswrapper[4931]: I1201 15:02:02.995150 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:02Z","lastTransitionTime":"2025-12-01T15:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.097279 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.097323 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.097336 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.097353 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.097365 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:03Z","lastTransitionTime":"2025-12-01T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.199123 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.199173 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.199184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.199204 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.199217 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:03Z","lastTransitionTime":"2025-12-01T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.240667 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:03 crc kubenswrapper[4931]: E1201 15:02:03.240790 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.301925 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.301997 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.302027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.302057 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.302081 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:03Z","lastTransitionTime":"2025-12-01T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.404164 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.404224 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.404236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.404252 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.404263 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:03Z","lastTransitionTime":"2025-12-01T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.506884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.506938 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.506948 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.506964 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.506975 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:03Z","lastTransitionTime":"2025-12-01T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.609853 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.609913 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.609926 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.609946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.609957 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:03Z","lastTransitionTime":"2025-12-01T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.713456 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.713536 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.713557 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.713586 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.713608 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:03Z","lastTransitionTime":"2025-12-01T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.816675 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.816724 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.816739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.816757 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.816772 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:03Z","lastTransitionTime":"2025-12-01T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.919011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.919054 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.919064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.919080 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:03 crc kubenswrapper[4931]: I1201 15:02:03.919091 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:03Z","lastTransitionTime":"2025-12-01T15:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.021884 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.021947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.021961 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.021983 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.021997 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.126058 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.126124 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.126141 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.126170 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.126190 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.230262 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.230324 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.230344 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.230370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.230431 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.240688 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.240768 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.240698 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:04 crc kubenswrapper[4931]: E1201 15:02:04.240867 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:04 crc kubenswrapper[4931]: E1201 15:02:04.240992 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:04 crc kubenswrapper[4931]: E1201 15:02:04.241069 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.262292 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.280096 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.302368 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.319346 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:
55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.332910 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.332956 
4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.332966 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.332983 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.333021 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.333274 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"2025-12-01T15:01:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda\\\\n2025-12-01T15:01:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda to /host/opt/cni/bin/\\\\n2025-12-01T15:01:16Z [verbose] multus-daemon started\\\\n2025-12-01T15:01:16Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T15:02:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.346482 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb0
82ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.363985 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.378463 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc 
kubenswrapper[4931]: I1201 15:02:04.397047 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.416232 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.431586 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.435516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.435560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.435575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.435593 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.435605 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.446559 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.457627 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.468538 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.490214 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:42Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 10.993708ms\\\\nI1201 15:01:42.136821 6600 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 15:01:42.136888 6600 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:42.136907 6600 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 
15:01:42.136950 6600 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 15:01:42.136961 6600 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 15:01:42.136982 6600 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 15:01:42.136984 6600 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:42.137016 6600 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 15:01:42.137009 6600 factory.go:656] Stopping watch factory\\\\nI1201 15:01:42.137027 6600 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:42.137031 6600 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 15:01:42.137436 6600 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 15:01:42.137572 6600 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 15:01:42.137648 6600 ovnkube.go:599] Stopped ovnkube\\\\nI1201 15:01:42.137691 6600 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 15:01:42.137785 6600 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.508136 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.525315 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31198
8a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.539336 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.539633 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.539723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.539801 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.539865 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.541592 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:04Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.642719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.642759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.642789 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.642806 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.642816 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.746355 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.746451 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.746462 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.746480 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.746491 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.849326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.849376 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.849412 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.849432 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.849450 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.952051 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.952089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.952101 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.952117 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:04 crc kubenswrapper[4931]: I1201 15:02:04.952128 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:04Z","lastTransitionTime":"2025-12-01T15:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.054909 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.054967 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.054986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.055012 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.055031 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.157821 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.158508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.158598 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.158693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.158785 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.241320 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:05 crc kubenswrapper[4931]: E1201 15:02:05.241562 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.261528 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.261773 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.261836 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.261898 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.261964 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.364120 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.364163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.364179 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.364198 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.364211 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.467666 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.467984 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.468088 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.468150 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.468226 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.571099 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.571143 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.571153 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.571171 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.571185 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.673494 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.673546 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.673558 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.673577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.673591 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.776820 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.776875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.776891 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.776913 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.776933 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.879596 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.879660 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.879671 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.879687 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.879700 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.982743 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.982796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.982808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.982823 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:05 crc kubenswrapper[4931]: I1201 15:02:05.982834 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:05Z","lastTransitionTime":"2025-12-01T15:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.085940 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.085996 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.086013 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.086034 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.086048 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:06Z","lastTransitionTime":"2025-12-01T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.188921 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.188988 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.189006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.189032 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.189051 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:06Z","lastTransitionTime":"2025-12-01T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.240795 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.240865 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.241205 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:06 crc kubenswrapper[4931]: E1201 15:02:06.241311 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:06 crc kubenswrapper[4931]: E1201 15:02:06.241523 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:06 crc kubenswrapper[4931]: E1201 15:02:06.241701 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.290739 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.290780 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.290792 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.290808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.290843 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:06Z","lastTransitionTime":"2025-12-01T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.393902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.393954 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.393963 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.393979 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.393988 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:06Z","lastTransitionTime":"2025-12-01T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.497593 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.497912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.498029 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.498204 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.498339 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:06Z","lastTransitionTime":"2025-12-01T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.601518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.601566 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.601575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.601591 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.601601 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:06Z","lastTransitionTime":"2025-12-01T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.704326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.704375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.704413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.704430 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.704440 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:06Z","lastTransitionTime":"2025-12-01T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.808057 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.808522 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.808713 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.808894 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.809090 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:06Z","lastTransitionTime":"2025-12-01T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.911565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.911634 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.911651 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.911677 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:06 crc kubenswrapper[4931]: I1201 15:02:06.911695 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:06Z","lastTransitionTime":"2025-12-01T15:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.014811 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.015493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.015537 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.015557 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.015570 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.118106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.118159 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.118173 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.118189 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.118199 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.220704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.220751 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.220762 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.220778 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.220809 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.240813 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:07 crc kubenswrapper[4931]: E1201 15:02:07.240998 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.323017 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.323278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.323350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.323437 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.323502 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.426856 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.426915 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.426935 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.426958 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.426978 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.529738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.529794 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.529813 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.529835 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.529851 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.536169 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.536230 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.536249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.536273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.536289 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: E1201 15:02:07.555242 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:07Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.559164 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.559216 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.559234 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.559258 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.559274 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: E1201 15:02:07.582764 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:07Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.588236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.588323 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.588380 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.588471 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.588534 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: E1201 15:02:07.602797 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:07Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.606144 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.606248 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.606308 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.606368 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.606460 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: E1201 15:02:07.616528 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:07Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.619965 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.620077 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.620141 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.620208 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.620276 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: E1201 15:02:07.637002 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:07Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:07 crc kubenswrapper[4931]: E1201 15:02:07.637113 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.638870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.638972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.639056 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.639131 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.639219 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.742020 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.742079 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.742097 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.742125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.742142 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.845603 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.845702 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.845722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.845747 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.845765 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.948584 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.948648 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.948669 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.948692 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:07 crc kubenswrapper[4931]: I1201 15:02:07.948710 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:07Z","lastTransitionTime":"2025-12-01T15:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.052039 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.052113 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.052135 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.052163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.052185 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.155955 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.156055 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.156076 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.156143 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.156163 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.241473 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.241537 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.241516 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:08 crc kubenswrapper[4931]: E1201 15:02:08.241615 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:08 crc kubenswrapper[4931]: E1201 15:02:08.241684 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:08 crc kubenswrapper[4931]: E1201 15:02:08.241739 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.258693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.258749 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.258768 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.258796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.258813 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.362282 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.362347 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.362367 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.362418 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.362438 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.465535 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.465883 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.465964 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.466047 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.466121 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.569304 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.569370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.569420 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.569448 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.569466 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.671993 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.672048 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.672065 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.672119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.672137 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.774541 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.774599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.774616 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.774641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.774662 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.878504 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.878763 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.878881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.878985 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.879064 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.981976 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.982036 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.982054 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.982080 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:08 crc kubenswrapper[4931]: I1201 15:02:08.982098 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:08Z","lastTransitionTime":"2025-12-01T15:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.086125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.087291 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.087538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.087694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.087842 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:09Z","lastTransitionTime":"2025-12-01T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.191420 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.191490 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.191509 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.191733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.191753 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:09Z","lastTransitionTime":"2025-12-01T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.240856 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:09 crc kubenswrapper[4931]: E1201 15:02:09.241066 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.295142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.295196 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.295216 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.295239 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.295256 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:09Z","lastTransitionTime":"2025-12-01T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.398523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.398585 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.398602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.398625 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.398642 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:09Z","lastTransitionTime":"2025-12-01T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.501341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.501449 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.501468 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.501496 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.501516 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:09Z","lastTransitionTime":"2025-12-01T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.604749 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.605152 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.605357 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.605596 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.605774 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:09Z","lastTransitionTime":"2025-12-01T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.708418 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.708509 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.708534 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.708585 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.708609 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:09Z","lastTransitionTime":"2025-12-01T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.811806 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.811857 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.811866 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.811881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.811893 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:09Z","lastTransitionTime":"2025-12-01T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.914679 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.914755 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.914778 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.914805 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:09 crc kubenswrapper[4931]: I1201 15:02:09.914825 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:09Z","lastTransitionTime":"2025-12-01T15:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.017817 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.017905 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.017920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.017974 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.017992 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.121609 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.121675 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.121694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.121722 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.121742 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.225567 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.225630 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.225648 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.225675 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.225695 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.241102 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:10 crc kubenswrapper[4931]: E1201 15:02:10.241256 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.241285 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.241854 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:10 crc kubenswrapper[4931]: E1201 15:02:10.242044 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:10 crc kubenswrapper[4931]: E1201 15:02:10.242252 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.242429 4931 scope.go:117] "RemoveContainer" containerID="b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.328172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.328448 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.328632 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.328767 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.328884 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.432974 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.433318 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.433554 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.433763 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.433966 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.537703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.537785 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.537816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.537850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.537873 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.641269 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.641310 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.641322 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.641339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.641350 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.687052 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/2.log" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.691998 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.693970 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.715502 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.732322 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.745567 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.745656 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.745678 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.745712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.745732 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.761525 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.778479 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.795448 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"2025-12-01T15:01:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda\\\\n2025-12-01T15:01:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda to /host/opt/cni/bin/\\\\n2025-12-01T15:01:16Z [verbose] multus-daemon started\\\\n2025-12-01T15:01:16Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T15:02:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.814980 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb0
82ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.835225 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.849914 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.849981 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.850054 4931 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.850085 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.850109 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.852545 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 
01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.867597 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.903135 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.928945 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.947727 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.953061 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.953095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.953109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.953128 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.953141 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:10Z","lastTransitionTime":"2025-12-01T15:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.958706 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.968020 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:10 crc kubenswrapper[4931]: I1201 15:02:10.987763 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:42Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 10.993708ms\\\\nI1201 15:01:42.136821 6600 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 15:01:42.136888 6600 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:42.136907 6600 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 
15:01:42.136950 6600 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 15:01:42.136961 6600 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 15:01:42.136982 6600 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 15:01:42.136984 6600 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:42.137016 6600 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 15:01:42.137009 6600 factory.go:656] Stopping watch factory\\\\nI1201 15:01:42.137027 6600 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:42.137031 6600 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 15:01:42.137436 6600 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 15:01:42.137572 6600 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 15:01:42.137648 6600 ovnkube.go:599] Stopped ovnkube\\\\nI1201 15:01:42.137691 6600 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 15:01:42.137785 6600 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:10.999924 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:10Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.020205 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31198
8a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.034443 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.055156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.055194 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.055204 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.055218 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.055227 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.157643 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.157681 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.157693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.157706 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.157718 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.241405 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:11 crc kubenswrapper[4931]: E1201 15:02:11.241576 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.260348 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.260415 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.260428 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.260447 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.260462 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.363754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.363824 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.363850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.363883 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.363910 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.466515 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.466561 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.466570 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.466589 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.466600 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.568885 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.568946 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.568960 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.568980 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.568999 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.671234 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.671291 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.671309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.671335 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.671352 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.698273 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/3.log" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.699622 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/2.log" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.703279 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" exitCode=1 Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.703343 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.703424 4931 scope.go:117] "RemoveContainer" containerID="b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.704336 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:02:11 crc kubenswrapper[4931]: E1201 15:02:11.704676 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.726189 4931 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20e
a12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b62
0cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.740342 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.755486 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"2025-12-01T15:01:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda\\\\n2025-12-01T15:01:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda to /host/opt/cni/bin/\\\\n2025-12-01T15:01:16Z [verbose] multus-daemon 
started\\\\n2025-12-01T15:01:16Z [verbose] Readiness Indicator file check\\\\n2025-12-01T15:02:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"e
tc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.771407 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb0
82ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.773740 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.773783 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.773799 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc 
kubenswrapper[4931]: I1201 15:02:11.773820 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.773835 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.785454 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01
T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.797924 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc 
kubenswrapper[4931]: I1201 15:02:11.810431 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.823241 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.835885 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.864750 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.876476 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.876524 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.876535 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.876552 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.876563 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.890760 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.910372 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.931253 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1d0f89b2db5b87352842590b9a2bb4e02666c6cc4d487b5551518ff3189fee9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:01:42Z\\\",\\\"message\\\":\\\"obj_retry.go:551] Creating *factory.egressNode crc took: 10.993708ms\\\\nI1201 15:01:42.136821 6600 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 15:01:42.136888 6600 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 15:01:42.136907 6600 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 
15:01:42.136950 6600 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1201 15:01:42.136961 6600 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1201 15:01:42.136982 6600 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 15:01:42.136984 6600 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 15:01:42.137016 6600 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1201 15:01:42.137009 6600 factory.go:656] Stopping watch factory\\\\nI1201 15:01:42.137027 6600 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 15:01:42.137031 6600 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1201 15:01:42.137436 6600 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 15:01:42.137572 6600 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 15:01:42.137648 6600 ovnkube.go:599] Stopped ovnkube\\\\nI1201 15:01:42.137691 6600 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 15:01:42.137785 6600 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:11Z\\\",\\\"message\\\":\\\"ork=default\\\\nI1201 15:02:11.330158 6965 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:02:11.330182 6965 services_controller.go:360] 
Finished syncing service cluster-monitoring-operator on namespace openshift-monitoring for network=default : 18.14µs\\\\nI1201 15:02:11.330202 6965 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1201 15:02:11.330237 6965 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 25.391µs\\\\nF1201 15:02:11.330248 6965 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:02:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/n
et.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.945784 4931 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.962199 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31198
8a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.975049 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.978871 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.978910 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.978927 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.978945 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.978959 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:11Z","lastTransitionTime":"2025-12-01T15:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:11 crc kubenswrapper[4931]: I1201 15:02:11.986504 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:11Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.003842 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46
ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.081760 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.081823 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.081844 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.081870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.081888 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:12Z","lastTransitionTime":"2025-12-01T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.185435 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.185491 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.185504 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.185526 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.185541 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:12Z","lastTransitionTime":"2025-12-01T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.240887 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.240958 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.240892 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:12 crc kubenswrapper[4931]: E1201 15:02:12.241098 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:12 crc kubenswrapper[4931]: E1201 15:02:12.241249 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:12 crc kubenswrapper[4931]: E1201 15:02:12.242054 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.289276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.289326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.289337 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.289352 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.289362 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:12Z","lastTransitionTime":"2025-12-01T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.391882 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.391937 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.391954 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.391979 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.391998 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:12Z","lastTransitionTime":"2025-12-01T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.495289 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.495345 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.495410 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.495432 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.495444 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:12Z","lastTransitionTime":"2025-12-01T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.599577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.599656 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.599682 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.599715 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.599750 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:12Z","lastTransitionTime":"2025-12-01T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.702617 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.702686 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.702709 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.702742 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.702766 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:12Z","lastTransitionTime":"2025-12-01T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.710661 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/3.log" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.715268 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:02:12 crc kubenswrapper[4931]: E1201 15:02:12.715542 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.735276 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.751231 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.777604 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e259
6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.794354 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.805501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.805566 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.805589 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.805620 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.805643 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:12Z","lastTransitionTime":"2025-12-01T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.810138 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"2025-12-01T15:01:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda\\\\n2025-12-01T15:01:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda to /host/opt/cni/bin/\\\\n2025-12-01T15:01:16Z [verbose] multus-daemon started\\\\n2025-12-01T15:01:16Z [verbose] Readiness Indicator file check\\\\n2025-12-01T15:02:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.820558 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb0
82ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.833427 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.842739 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc 
kubenswrapper[4931]: I1201 15:02:12.854603 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.869253 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.881415 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.893368 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.902261 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.908557 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.908602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.908615 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.908632 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.908642 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:12Z","lastTransitionTime":"2025-12-01T15:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.926294 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:11Z\\\",\\\"message\\\":\\\"ork=default\\\\nI1201 15:02:11.330158 6965 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:02:11.330182 6965 services_controller.go:360] Finished syncing service cluster-monitoring-operator on namespace openshift-monitoring for network=default : 18.14µs\\\\nI1201 15:02:11.330202 6965 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1201 15:02:11.330237 6965 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 25.391µs\\\\nF1201 15:02:11.330248 6965 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:02:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.937909 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.950613 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.961709 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:12 crc kubenswrapper[4931]: I1201 15:02:12.974722 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:12Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.010771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.010806 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.010816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.010832 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.010844 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.114085 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.114133 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.114149 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.114176 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.114194 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.217050 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.217096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.217108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.217128 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.217242 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.240422 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:13 crc kubenswrapper[4931]: E1201 15:02:13.240541 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.319418 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.319448 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.319457 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.319470 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.319481 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.422483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.422526 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.422575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.422588 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.422607 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.525763 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.525840 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.525858 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.525886 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.525906 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.629433 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.629477 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.629487 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.629503 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.629514 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.733019 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.733065 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.733079 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.733096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.733110 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.835211 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.835308 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.835326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.835351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.835367 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.938992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.939054 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.939071 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.939098 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:13 crc kubenswrapper[4931]: I1201 15:02:13.939117 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:13Z","lastTransitionTime":"2025-12-01T15:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.044305 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.044360 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.044370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.044401 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.044588 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.148175 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.148216 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.148226 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.148241 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.148251 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.241158 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:14 crc kubenswrapper[4931]: E1201 15:02:14.241343 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.241185 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:14 crc kubenswrapper[4931]: E1201 15:02:14.241479 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.241185 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:14 crc kubenswrapper[4931]: E1201 15:02:14.241676 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.250356 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.250419 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.250432 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.250446 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.250457 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.258290 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.280488 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-a
dditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df3
12ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.299210 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:5
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.313264 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-s
cript\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.336292 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e259
6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.352087 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.352127 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.352139 4931 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.352156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.352168 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.352907 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da
410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.369712 4931 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-6nwqj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"2025-12-01T15:01:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda\\\\n2025-12-01T15:01:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda to /host/opt/cni/bin/\\\\n2025-12-01T15:01:16Z [verbose] 
multus-daemon started\\\\n2025-12-01T15:01:16Z [verbose] Readiness Indicator file check\\\\n2025-12-01T15:02:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"
name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.381881 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb0
82ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.393965 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.406243 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc 
kubenswrapper[4931]: I1201 15:02:14.418181 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.431022 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.444152 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.454489 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.454571 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.454592 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.454617 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.454634 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.457113 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.469554 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.493116 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:11Z\\\",\\\"message\\\":\\\"ork=default\\\\nI1201 15:02:11.330158 6965 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:02:11.330182 6965 services_controller.go:360] Finished syncing service cluster-monitoring-operator on namespace openshift-monitoring for network=default : 18.14µs\\\\nI1201 15:02:11.330202 6965 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1201 15:02:11.330237 6965 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 25.391µs\\\\nF1201 15:02:11.330248 6965 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:02:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.510637 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.525487 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:14Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.558632 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.558690 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.558704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.558723 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.558738 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.661142 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.661200 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.661217 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.661242 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.661260 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.775533 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.775599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.775617 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.775644 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.775662 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.878472 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.878504 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.878513 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.878527 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.878538 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.981480 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.981553 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.981565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.981581 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:14 crc kubenswrapper[4931]: I1201 15:02:14.981593 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:14Z","lastTransitionTime":"2025-12-01T15:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.084447 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.084499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.084511 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.084530 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.084543 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:15Z","lastTransitionTime":"2025-12-01T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.187346 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.187410 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.187420 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.187434 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.187444 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:15Z","lastTransitionTime":"2025-12-01T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.240567 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:15 crc kubenswrapper[4931]: E1201 15:02:15.241023 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.254224 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.289401 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.289442 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.289452 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.289467 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.289478 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:15Z","lastTransitionTime":"2025-12-01T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.392504 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.392545 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.392553 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.392567 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.392576 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:15Z","lastTransitionTime":"2025-12-01T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.495735 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.495831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.495850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.495876 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.495893 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:15Z","lastTransitionTime":"2025-12-01T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.599887 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.599936 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.599951 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.599971 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.599983 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:15Z","lastTransitionTime":"2025-12-01T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.704037 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.704114 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.704133 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.704158 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.704177 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:15Z","lastTransitionTime":"2025-12-01T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.807568 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.807636 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.807838 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.807865 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.807886 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:15Z","lastTransitionTime":"2025-12-01T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.912068 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.912139 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.912157 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.912183 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:15 crc kubenswrapper[4931]: I1201 15:02:15.912203 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:15Z","lastTransitionTime":"2025-12-01T15:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.014613 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.014675 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.014696 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.014719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.014736 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.027076 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.027228 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.027261 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.027285 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.027313 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027381 4931 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027433 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:20.027359239 +0000 UTC m=+146.453232936 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027464 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027484 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:03:20.027466282 +0000 UTC m=+146.453339989 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027485 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027490 4931 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027521 4931 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027591 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 15:03:20.027579955 +0000 UTC m=+146.453453652 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027603 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027654 4931 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027615 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 15:03:20.027603266 +0000 UTC m=+146.453476973 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027680 4931 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.027752 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 15:03:20.02772454 +0000 UTC m=+146.453598237 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.117921 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.117984 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.118004 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.118029 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.118048 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.222309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.222363 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.222372 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.222402 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.222416 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.240719 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.240862 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.240973 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.241180 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.241525 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:16 crc kubenswrapper[4931]: E1201 15:02:16.241374 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.325577 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.325648 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.325668 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.325692 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.325709 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.428962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.429007 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.429018 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.429036 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.429048 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.531962 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.532025 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.532044 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.532074 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.532095 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.634808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.634864 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.634885 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.634910 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.634928 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.738001 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.738083 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.738108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.738141 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.738166 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.840896 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.840956 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.840981 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.841008 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.841029 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.944165 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.944203 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.944215 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.944234 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:16 crc kubenswrapper[4931]: I1201 15:02:16.944246 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:16Z","lastTransitionTime":"2025-12-01T15:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.047837 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.047909 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.047929 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.047958 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.047977 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.151355 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.151462 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.151481 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.151505 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.151521 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.240758 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:17 crc kubenswrapper[4931]: E1201 15:02:17.241111 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.256031 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.256092 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.256111 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.256151 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.256172 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.360065 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.360116 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.360128 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.360148 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.360161 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.463170 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.463246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.463264 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.463292 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.463312 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.566518 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.566598 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.566617 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.566648 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.566667 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.670097 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.670171 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.670188 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.670212 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.670229 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.704099 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.704165 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.704190 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.704222 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.704245 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: E1201 15:02:17.726458 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.732026 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.732079 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.732096 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.732120 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.732138 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: E1201 15:02:17.757057 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.763207 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.763255 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.763276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.763302 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.763319 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: E1201 15:02:17.784455 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.789590 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.789644 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.789661 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.789685 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.789702 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: E1201 15:02:17.809170 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.814612 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.814670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.814690 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.814715 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.814734 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: E1201 15:02:17.836667 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:17Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:17 crc kubenswrapper[4931]: E1201 15:02:17.836858 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.839276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.839309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.839322 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.839338 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.839350 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.943379 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.943472 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.943490 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.943519 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:17 crc kubenswrapper[4931]: I1201 15:02:17.943539 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:17Z","lastTransitionTime":"2025-12-01T15:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.046372 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.046440 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.046455 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.046474 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.046488 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.150475 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.150532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.150545 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.150562 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.150575 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.240627 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.240720 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.240655 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:18 crc kubenswrapper[4931]: E1201 15:02:18.240865 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:18 crc kubenswrapper[4931]: E1201 15:02:18.241008 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:18 crc kubenswrapper[4931]: E1201 15:02:18.241140 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.253895 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.253981 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.253999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.254023 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.254043 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.357071 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.357129 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.357150 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.357172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.357188 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.460477 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.460538 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.460556 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.460580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.460598 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.564293 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.564374 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.564463 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.564508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.564531 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.668600 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.668693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.668719 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.668753 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.668771 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.772184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.772254 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.772272 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.772295 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.772313 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.875184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.875256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.875272 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.875299 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.875321 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.978473 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.978530 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.978548 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.978571 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:18 crc kubenswrapper[4931]: I1201 15:02:18.978589 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:18Z","lastTransitionTime":"2025-12-01T15:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.081327 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.081416 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.081438 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.081464 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.081484 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:19Z","lastTransitionTime":"2025-12-01T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.185557 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.185611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.185629 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.185655 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.185672 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:19Z","lastTransitionTime":"2025-12-01T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.240815 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:19 crc kubenswrapper[4931]: E1201 15:02:19.240976 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.288523 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.288581 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.288598 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.288622 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.288641 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:19Z","lastTransitionTime":"2025-12-01T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.392745 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.393521 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.393548 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.393580 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.393607 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:19Z","lastTransitionTime":"2025-12-01T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.497001 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.497035 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.497043 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.497056 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.497065 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:19Z","lastTransitionTime":"2025-12-01T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.600048 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.600091 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.600100 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.600117 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.600128 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:19Z","lastTransitionTime":"2025-12-01T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.703744 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.703796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.703814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.703840 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.703858 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:19Z","lastTransitionTime":"2025-12-01T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.806645 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.806711 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.806730 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.806754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.806773 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:19Z","lastTransitionTime":"2025-12-01T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.910816 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.910881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.910898 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.910926 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:19 crc kubenswrapper[4931]: I1201 15:02:19.910945 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:19Z","lastTransitionTime":"2025-12-01T15:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.015034 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.015102 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.015119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.015148 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.015171 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.119353 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.119468 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.119487 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.119519 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.119537 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.223007 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.223084 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.223103 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.223137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.223157 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.240990 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.241075 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.241141 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:20 crc kubenswrapper[4931]: E1201 15:02:20.241337 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:20 crc kubenswrapper[4931]: E1201 15:02:20.241608 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:20 crc kubenswrapper[4931]: E1201 15:02:20.241759 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.326172 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.326230 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.326250 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.326276 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.326295 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.429182 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.429236 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.429256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.429281 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.429301 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.534044 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.534108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.534125 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.534154 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.534173 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.636925 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.636993 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.637011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.637035 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.637061 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.740995 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.741062 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.741080 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.741109 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.741127 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.844495 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.844574 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.844598 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.844632 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.844655 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.947086 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.947156 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.947174 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.947201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:20 crc kubenswrapper[4931]: I1201 15:02:20.947220 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:20Z","lastTransitionTime":"2025-12-01T15:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.049713 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.049782 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.049799 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.049874 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.049894 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.152907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.152968 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.152992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.153022 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.153085 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.240662 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:21 crc kubenswrapper[4931]: E1201 15:02:21.240868 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.256368 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.256454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.256472 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.256496 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.256514 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.359442 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.359575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.359601 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.359627 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.359644 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.463458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.463537 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.463561 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.463596 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.463621 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.566944 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.566992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.567003 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.567027 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.567050 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.670506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.670575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.670593 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.670621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.670640 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.774331 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.774410 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.774427 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.774450 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.774466 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.877980 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.878456 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.878613 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.878754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.878890 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.982659 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.982733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.982750 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.982773 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:21 crc kubenswrapper[4931]: I1201 15:02:21.982791 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:21Z","lastTransitionTime":"2025-12-01T15:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.086360 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.086466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.086484 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.086512 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.086532 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:22Z","lastTransitionTime":"2025-12-01T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.189452 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.189527 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.189551 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.189587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.189616 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:22Z","lastTransitionTime":"2025-12-01T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.240571 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.240645 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:22 crc kubenswrapper[4931]: E1201 15:02:22.240712 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:22 crc kubenswrapper[4931]: E1201 15:02:22.240821 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.240877 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:22 crc kubenswrapper[4931]: E1201 15:02:22.241027 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.293243 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.293356 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.293372 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.293426 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.293445 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:22Z","lastTransitionTime":"2025-12-01T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.396635 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.396733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.396759 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.396788 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.396808 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:22Z","lastTransitionTime":"2025-12-01T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.500034 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.500530 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.500704 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.500847 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.500976 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:22Z","lastTransitionTime":"2025-12-01T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.604578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.605561 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.605622 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.605654 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.605676 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:22Z","lastTransitionTime":"2025-12-01T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.709912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.709974 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.709992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.710021 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.710042 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:22Z","lastTransitionTime":"2025-12-01T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.813584 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.813642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.813660 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.813684 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.813701 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:22Z","lastTransitionTime":"2025-12-01T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.916521 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.916575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.916592 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.916615 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:22 crc kubenswrapper[4931]: I1201 15:02:22.916634 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:22Z","lastTransitionTime":"2025-12-01T15:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.019065 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.019123 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.019139 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.019161 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.019178 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.121370 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.121467 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.121485 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.121510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.121527 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.224425 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.224474 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.224486 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.224504 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.224520 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.240536 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:23 crc kubenswrapper[4931]: E1201 15:02:23.240667 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.241887 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:02:23 crc kubenswrapper[4931]: E1201 15:02:23.242189 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.326854 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.326929 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.326947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.326972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.326991 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.429449 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.429498 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.429513 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.429532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.429545 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.531919 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.531952 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.531966 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.531983 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.531993 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.635696 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.635763 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.635780 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.635807 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.635826 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.739446 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.739525 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.739549 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.739581 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.739605 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.841987 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.842290 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.842478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.842629 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.842771 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.945753 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.945948 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.945968 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.945993 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:23 crc kubenswrapper[4931]: I1201 15:02:23.946015 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:23Z","lastTransitionTime":"2025-12-01T15:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.049506 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.049575 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.049595 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.049621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.049646 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.152783 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.152856 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.152874 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.152899 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.152917 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.241140 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.241271 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:24 crc kubenswrapper[4931]: E1201 15:02:24.241429 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.241475 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:24 crc kubenswrapper[4931]: E1201 15:02:24.241560 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:24 crc kubenswrapper[4931]: E1201 15:02:24.241617 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.258354 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.258446 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.258471 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.258501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.258522 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.259769 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78dk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e105961-27de-4865-bd7b-44dd04d12034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7tmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78dk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc 
kubenswrapper[4931]: I1201 15:02:24.278607 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.295922 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c4a333-0668-4ab2-a049-ee67890fde06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://825f8f254baad923387374c5af4df73ee3517dbe50ec03d4ab824d260692d4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://266644e388b0a2b662912f43197e958f6f3d51c4a14d9dc615bc2ab644a35cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://266644e388b0a2b662912f43197e958f6f3d51c4a14d9dc615bc2ab644a35cd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.327428 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.345885 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:
55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.361377 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.361426 
4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.361436 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.361449 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.361459 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.361784 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"2025-12-01T15:01:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda\\\\n2025-12-01T15:01:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda to /host/opt/cni/bin/\\\\n2025-12-01T15:01:16Z [verbose] multus-daemon started\\\\n2025-12-01T15:01:16Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T15:02:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.372744 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"daf46d9f-9b61-4808-ab42-392965da3a7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f478117131f7904d2db2618a60d19c859e96d73c8c052e305922a5ae512783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb0
82ce579eda6ce9e8f3cd01a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-crxtx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.384599 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1a8fdf6-a549-4875-9712-bab1069cfd7d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://025c658f18a3d31a7e390efd851d4046626d0fef53482ca56d9d1e3ce684744f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640292ba3e42fe63b35af563d05da7a5df12a
4277ab98d12494bc30ec75966dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nqthv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t4vqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.396416 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.410620 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.422042 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.437210 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d17568d590afd0bc990fec36499d83ad05f2ddb6c7957384e6209d37afa82d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0799a8703596fbbd162e9563ca8f62a4793616572765d85a4f0485394106f366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.448657 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-k8x6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62446422-f8d8-45d1-81ef-4228b06c21eb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f59fac2fc00ba0f8630c64dac76c38fee144ed19f8aff3a1f9775929d6c6549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-k8x6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.460055 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2qrqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f29024b3-c46f-4ef0-8baa-89705f2171f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20ffd89cb8451ff4be7e538bdca85f226b342c04943271e7c1adba59a015da49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jd4tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2qrqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.464510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.464582 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.464603 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.464631 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.464652 4931 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.490716 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:11Z\\\",\\\"message\\\":\\\"ork=default\\\\nI1201 15:02:11.330158 6965 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 15:02:11.330182 6965 services_controller.go:360] Finished syncing service cluster-monitoring-operator on namespace openshift-monitoring for network=default : 18.14µs\\\\nI1201 15:02:11.330202 6965 services_controller.go:356] Processing sync for service openshift-network-diagnostics/network-check-source for network=default\\\\nI1201 15:02:11.330237 6965 services_controller.go:360] Finished syncing service network-check-source on namespace openshift-network-diagnostics for network=default : 25.391µs\\\\nF1201 15:02:11.330248 6965 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:02:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ab9831ef2daf0fd00
d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8b56b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v5g28\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.507484 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14d40a20993641f586c74c85edb5ad6e882e7d6f3a32b652d16d31955a51ac30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.530369 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04108827-fec1-408b-8fba-feaa1175ed4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://191bc623cdd6c4a04aef60a1947aabe2229d908561fb0308d903e74b5409d425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a269b12da61eabd910a56fae75b59668b3bf42c1c4ac9d5aa961bdf93be3d056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cab54cbd41a1d19d7799dd639680e3404e6b41fecf9cfdf78b63b793d3034d78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdec3f62b66a65f68e84401dbfc7603c5625d440f5f841e77d321bdc37825fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://311988a0e0c1f6d7ad380c5e4e39b288c431428220b2c043f55787708b972eef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3605137a029bee61ee13ba3bf1fe59bdd4cf485b64763fb4729e6daad44ebad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a74a0eba6afa68229ac3278ca26561846d59ba4eae0eda705eacb6f843cb893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:01:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgdsm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfb8b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.553949 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1299bbfe-9ffb-483a-ba5a-ea391efdc803\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.566972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.567031 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.567050 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.567074 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.567093 4931 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.573228 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4924cb71312ff3a0678b501b5e67fd887f8fbb41458762749e17c77c83661549\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scrip
t\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:24Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.670414 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.670831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.670905 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.670985 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.671054 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.773339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.773437 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.773456 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.773478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.773496 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.876350 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.876448 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.876467 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.876494 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.876513 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.979814 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.979878 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.979899 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.979922 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:24 crc kubenswrapper[4931]: I1201 15:02:24.979938 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:24Z","lastTransitionTime":"2025-12-01T15:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.091329 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.092107 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.092260 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.092565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.092857 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:25Z","lastTransitionTime":"2025-12-01T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.196499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.196629 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.196655 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.196688 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.196711 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:25Z","lastTransitionTime":"2025-12-01T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.241273 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:25 crc kubenswrapper[4931]: E1201 15:02:25.241506 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.299796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.299866 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.299886 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.299912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.299931 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:25Z","lastTransitionTime":"2025-12-01T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.403566 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.403625 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.403646 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.403672 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.403691 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:25Z","lastTransitionTime":"2025-12-01T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.506633 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.506685 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.506703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.506731 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.506748 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:25Z","lastTransitionTime":"2025-12-01T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.609625 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.610023 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.610126 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.610233 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.610333 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:25Z","lastTransitionTime":"2025-12-01T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.713616 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.713682 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.713705 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.713736 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.713761 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:25Z","lastTransitionTime":"2025-12-01T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.816510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.816611 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.816630 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.816654 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.816672 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:25Z","lastTransitionTime":"2025-12-01T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.919254 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.919323 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.919346 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.919376 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:25 crc kubenswrapper[4931]: I1201 15:02:25.919436 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:25Z","lastTransitionTime":"2025-12-01T15:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.023138 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.023222 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.023246 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.023280 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.023300 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.125272 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.125333 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.125351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.125375 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.125416 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.228478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.228693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.228864 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.229044 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.229218 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.241563 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.241612 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.241704 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:26 crc kubenswrapper[4931]: E1201 15:02:26.241906 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:26 crc kubenswrapper[4931]: E1201 15:02:26.242279 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:26 crc kubenswrapper[4931]: E1201 15:02:26.242678 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.332353 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.332458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.332483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.332512 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.332532 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.435848 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.435904 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.435920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.435944 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.435962 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.539731 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.539815 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.539855 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.539892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.539918 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.643314 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.643435 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.643460 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.643515 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.643536 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.747819 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.747895 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.747919 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.747944 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.747961 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.850783 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.850846 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.850864 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.850934 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.850957 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.954220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.954301 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.954326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.954361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:26 crc kubenswrapper[4931]: I1201 15:02:26.954414 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:26Z","lastTransitionTime":"2025-12-01T15:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.058458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.058517 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.058530 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.058547 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.058559 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.162319 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.162433 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.162507 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.162599 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.162653 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.241181 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:27 crc kubenswrapper[4931]: E1201 15:02:27.241469 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.266507 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.266560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.266578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.266643 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.266667 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.369594 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.369662 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.369679 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.369709 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.369734 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.474849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.474907 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.474920 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.474942 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.474963 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.578549 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.578624 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.578646 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.578694 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.578728 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.682553 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.682624 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.682643 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.682670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.682698 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.785574 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.785637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.785652 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.785673 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.785688 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.888761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.888835 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.888851 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.888881 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.888903 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.955988 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.956070 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.956093 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.956122 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.956146 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:27 crc kubenswrapper[4931]: E1201 15:02:27.974550 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.980366 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.980483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.980502 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.980535 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:27 crc kubenswrapper[4931]: I1201 15:02:27.980555 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:27Z","lastTransitionTime":"2025-12-01T15:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: E1201 15:02:28.003887 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:27Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.011711 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.011822 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.011974 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.012015 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.012038 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: E1201 15:02:28.031859 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.036867 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.036903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.036914 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.036932 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.036943 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: E1201 15:02:28.056047 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.061659 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.061716 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.061733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.061756 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.061774 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: E1201 15:02:28.078895 4931 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2309286a-3bdf-4d90-8920-f6c1244ed71c\\\",\\\"systemUUID\\\":\\\"a263e267-40f6-4472-9fe3-92cd328d0ad9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:28Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:28 crc kubenswrapper[4931]: E1201 15:02:28.079061 4931 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.081629 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.081703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.081728 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.081760 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.081784 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.184544 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.184604 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.184616 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.184635 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.184652 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.241828 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.241946 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:28 crc kubenswrapper[4931]: E1201 15:02:28.242054 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:28 crc kubenswrapper[4931]: E1201 15:02:28.242170 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.241866 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:28 crc kubenswrapper[4931]: E1201 15:02:28.242316 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.287673 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.287742 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.287760 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.287785 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.287804 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.391089 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.391163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.391184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.391211 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.391229 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.494921 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.495565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.495796 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.495955 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.496111 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.599795 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.599853 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.599870 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.599889 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.599903 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.703331 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.703834 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.703986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.704133 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.704284 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.807219 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.807315 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.807335 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.807361 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.807379 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.910642 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.910703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.910725 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.910754 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:28 crc kubenswrapper[4931]: I1201 15:02:28.910777 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:28Z","lastTransitionTime":"2025-12-01T15:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.014201 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.014284 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.014306 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.014331 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.014350 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.117767 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.117832 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.117850 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.117877 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.117894 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.220339 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.220490 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.220516 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.220548 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.220572 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.241129 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:29 crc kubenswrapper[4931]: E1201 15:02:29.241366 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.324342 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.324414 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.324427 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.324445 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.324458 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.427163 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.427213 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.427224 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.427245 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.427256 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.530078 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.530123 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.530141 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.530167 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.530187 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.633100 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.633184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.633207 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.633235 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.633253 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.736662 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.736741 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.736758 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.736783 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.736801 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.839436 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.839483 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.839493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.839510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.839519 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.942761 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.942831 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.942848 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.942875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:29 crc kubenswrapper[4931]: I1201 15:02:29.942893 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:29Z","lastTransitionTime":"2025-12-01T15:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.046239 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.046321 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.046342 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.046421 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.046444 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.149491 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.149558 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.149570 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.149587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.149598 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.241026 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.241193 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.241466 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:30 crc kubenswrapper[4931]: E1201 15:02:30.241602 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:30 crc kubenswrapper[4931]: E1201 15:02:30.241770 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:30 crc kubenswrapper[4931]: E1201 15:02:30.241941 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.252137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.252181 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.252200 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.252224 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.252243 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.355665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.355999 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.356071 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.356137 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.356203 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.458908 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.458969 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.458986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.459012 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.459032 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.562244 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.562316 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.562341 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.562374 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.562442 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.665589 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.665651 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.665670 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.665696 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.665713 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.769490 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.769552 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.769569 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.769597 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.769616 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.872465 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.872501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.872510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.872525 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.872535 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.975833 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.975875 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.975888 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.975905 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:30 crc kubenswrapper[4931]: I1201 15:02:30.975916 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:30Z","lastTransitionTime":"2025-12-01T15:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.079462 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.079520 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.079536 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.079560 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.079578 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:31Z","lastTransitionTime":"2025-12-01T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.182915 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.182985 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.183006 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.183033 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.183052 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:31Z","lastTransitionTime":"2025-12-01T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.241281 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:31 crc kubenswrapper[4931]: E1201 15:02:31.241590 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.285720 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.285808 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.285836 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.285871 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.285898 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:31Z","lastTransitionTime":"2025-12-01T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.388888 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.388963 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.388984 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.389014 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.389033 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:31Z","lastTransitionTime":"2025-12-01T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.491781 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.491855 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.491876 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.491903 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.491922 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:31Z","lastTransitionTime":"2025-12-01T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.595837 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.595898 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.595912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.595935 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.595954 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:31Z","lastTransitionTime":"2025-12-01T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.699208 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.699273 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.699289 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.699312 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.699327 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:31Z","lastTransitionTime":"2025-12-01T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.802431 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.802509 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.802532 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.802564 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.802583 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:31Z","lastTransitionTime":"2025-12-01T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.905609 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.905671 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.905688 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.905712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:31 crc kubenswrapper[4931]: I1201 15:02:31.905728 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:31Z","lastTransitionTime":"2025-12-01T15:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.009376 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.009499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.009524 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.009557 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.009577 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.020121 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:32 crc kubenswrapper[4931]: E1201 15:02:32.020301 4931 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:02:32 crc kubenswrapper[4931]: E1201 15:02:32.020421 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs podName:2e105961-27de-4865-bd7b-44dd04d12034 nodeName:}" failed. No retries permitted until 2025-12-01 15:03:36.020356226 +0000 UTC m=+162.446229923 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs") pod "network-metrics-daemon-78dk9" (UID: "2e105961-27de-4865-bd7b-44dd04d12034") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.113434 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.113507 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.113525 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.113553 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.113576 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.217119 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.217187 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.217206 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.217231 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.217250 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.241318 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.241342 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.241363 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:32 crc kubenswrapper[4931]: E1201 15:02:32.241757 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:32 crc kubenswrapper[4931]: E1201 15:02:32.241938 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:32 crc kubenswrapper[4931]: E1201 15:02:32.242090 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.320815 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.320887 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.320912 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.320942 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.320966 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.424220 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.424292 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.424309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.424335 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.424352 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.528312 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.528432 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.528452 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.528478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.528497 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.631621 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.631693 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.631712 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.631730 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.631741 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.734637 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.734696 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.734713 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.734738 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.734756 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.837645 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.837988 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.838011 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.838035 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.838053 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.941333 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.941414 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.941432 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.941455 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:32 crc kubenswrapper[4931]: I1201 15:02:32.941475 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:32Z","lastTransitionTime":"2025-12-01T15:02:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.045724 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.045786 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.045800 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.045820 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.045835 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.148622 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.148706 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.148733 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.148766 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.148791 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.241305 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:33 crc kubenswrapper[4931]: E1201 15:02:33.241584 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.251861 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.251902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.251915 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.251932 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.251947 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.355290 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.355419 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.355446 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.355478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.355500 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.457895 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.457947 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.457964 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.457986 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.458003 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.561466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.561541 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.561559 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.561585 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.561605 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.664817 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.664906 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.664929 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.664951 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.664968 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.768326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.768419 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.768439 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.768473 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.768494 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.872179 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.872240 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.872258 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.872283 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.872300 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.976048 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.977064 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.977222 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.977373 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:33 crc kubenswrapper[4931]: I1201 15:02:33.977553 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:33Z","lastTransitionTime":"2025-12-01T15:02:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.080873 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.081256 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.081458 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.081616 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.081744 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:34Z","lastTransitionTime":"2025-12-01T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.184512 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.184572 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.184587 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.184612 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.184630 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:34Z","lastTransitionTime":"2025-12-01T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.241213 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:34 crc kubenswrapper[4931]: E1201 15:02:34.241657 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.241797 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.241836 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:34 crc kubenswrapper[4931]: E1201 15:02:34.242835 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:34 crc kubenswrapper[4931]: E1201 15:02:34.242964 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.259257 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c347daf-a75c-466d-ad40-309727de9c72\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cee69d1d65f4c4092b83ff602f0f4ee7d889c2c79c8484825085de973183415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8826478d61a03be23609ebcaae492fa0ef2b0f932ca5798f9c5ee7f254e768eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad736a5c5f94fc2cc7e4a831058157f7c25577372b71657f4e30c3240026330c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://588ccf6c38f6193b2ff364aa3bf6e3134702774fbc2811d547eb3d735b84bceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.274893 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c4a333-0668-4ab2-a049-ee67890fde06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://825f8f254baad923387374c5af4df73ee3517dbe50ec03d4ab824d260692d4b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://266644e388b0a2b662912f43197e958f6f3d51c4a14d9dc615bc2ab644a35cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://266644e388b0a2b662912f43197e958f6f3d51c4a14d9dc615bc2ab644a35cd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.287917 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.288146 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.288307 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.288522 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.288687 4931 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:34Z","lastTransitionTime":"2025-12-01T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.307532 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61862f0-3e18-4654-a543-ad1a4c958781\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0389455f5f6fcd117c434f21dcef5c6569ed5a88b4e769c3e29461c631669b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://daa7e2c3433a8d1e03e9e57d087a6ee56bfde5171dc24a87ef20ea12ffca3310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8bfa25d2bd8873284b83b2d42db4db2a18f9cdf24e4d297d87f5e4735253d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7bc075cbdae2dd0c8a2338a0930903f07485e4fa326fc76e13972408a915754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44d04ad117f1c728bc67876ce1c53b4e59ddfe9fcd38df282b2392173a7a3c6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e
619c5e5461c9c3e38897225ddf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e17721ba3953448bb87c3e51d9b620cbfee17e619c5e5461c9c3e38897225ddf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f6f41438591765f225bc184d6a360dc334d10b4a29c5e98ec3838f720da8d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ffc4fd4c9f1fe52f57f56d59e59a1df1e5eefd3446bc68f98ea095264e2596c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T15:00:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.325203 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4b49c2e-b9e6-4585-a583-349b417cf0ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:00:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6f023b5499357d5a53adc43600550dc79a7c438e54142a7b0b658903d289043\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876d66e73156b986db7c3473bd172a4aaff0f3c89d9456d1ccd7aa86252e1022\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac57e31c3d6b450479325bd55899d7513016545a45da9b2089287381e9ebe9d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-01T15:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:00:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.342598 4931 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6nwqj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db092a9c-f0f2-401d-82dd-b3af535585cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:01:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T15:02:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T15:02:01Z\\\",\\\"message\\\":\\\"2025-12-01T15:01:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda\\\\n2025-12-01T15:01:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b2be9168-df07-4635-a6e8-a03c17c2bcda to /host/opt/cni/bin/\\\\n2025-12-01T15:01:16Z [verbose] multus-daemon started\\\\n2025-12-01T15:01:16Z [verbose] 
Readiness Indicator file check\\\\n2025-12-01T15:02:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T15:01:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T15:02:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hh4ff\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T15:01:14Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6nwqj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T15:02:34Z is after 2025-08-24T17:21:41Z" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.391239 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.391293 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.391309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.391332 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.391348 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:34Z","lastTransitionTime":"2025-12-01T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.494758 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.494821 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.494838 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.494860 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.494873 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:34Z","lastTransitionTime":"2025-12-01T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.497997 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podStartSLOduration=81.497971762 podStartE2EDuration="1m21.497971762s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:34.483294154 +0000 UTC m=+100.909167861" watchObservedRunningTime="2025-12-01 15:02:34.497971762 +0000 UTC m=+100.923845429" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.512728 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t4vqp" podStartSLOduration=80.512705592 podStartE2EDuration="1m20.512705592s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:34.498582881 +0000 UTC m=+100.924456548" watchObservedRunningTime="2025-12-01 15:02:34.512705592 +0000 UTC m=+100.938579259" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.586251 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-k8x6d" podStartSLOduration=81.586226298 podStartE2EDuration="1m21.586226298s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:34.586216288 +0000 UTC m=+101.012089965" watchObservedRunningTime="2025-12-01 15:02:34.586226298 +0000 UTC m=+101.012099985" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.597309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:34 crc 
kubenswrapper[4931]: I1201 15:02:34.597551 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.597641 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.597725 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.597801 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:34Z","lastTransitionTime":"2025-12-01T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.599105 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2qrqd" podStartSLOduration=81.599089992 podStartE2EDuration="1m21.599089992s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:34.596974609 +0000 UTC m=+101.022848286" watchObservedRunningTime="2025-12-01 15:02:34.599089992 +0000 UTC m=+101.024963669" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.653599 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nfb8b" podStartSLOduration=80.653573369 podStartE2EDuration="1m20.653573369s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 15:02:34.652990192 +0000 UTC m=+101.078863859" watchObservedRunningTime="2025-12-01 15:02:34.653573369 +0000 UTC m=+101.079447036" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.670315 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=82.670297628 podStartE2EDuration="1m22.670297628s" podCreationTimestamp="2025-12-01 15:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:34.669852245 +0000 UTC m=+101.095725942" watchObservedRunningTime="2025-12-01 15:02:34.670297628 +0000 UTC m=+101.096171295" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.700638 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.700681 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.700692 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.700706 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.700715 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:34Z","lastTransitionTime":"2025-12-01T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.802452 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.802499 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.802508 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.802522 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.802531 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:34Z","lastTransitionTime":"2025-12-01T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.905248 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.905310 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.905326 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.905351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:34 crc kubenswrapper[4931]: I1201 15:02:34.905366 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:34Z","lastTransitionTime":"2025-12-01T15:02:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.007771 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.007838 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.007852 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.007873 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.007887 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.110328 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.110406 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.110420 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.110443 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.110455 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.212869 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.212931 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.212948 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.212972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.212994 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.240412 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:35 crc kubenswrapper[4931]: E1201 15:02:35.240588 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.316443 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.316497 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.316510 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.316527 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.316540 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.419544 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.419600 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.419614 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.419635 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.419648 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.525249 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.525331 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.525351 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.525381 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.525431 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.628311 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.628408 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.628427 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.628454 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.628472 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.732166 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.732232 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.732250 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.732278 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.732297 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.835106 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.835153 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.835165 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.835182 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.835195 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.941615 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.941682 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.941703 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.941727 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:35 crc kubenswrapper[4931]: I1201 15:02:35.941746 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:35Z","lastTransitionTime":"2025-12-01T15:02:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.044918 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.044992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.045016 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.045067 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.045093 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.148790 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.148856 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.148876 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.148902 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.148954 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.240525 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.240577 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.240525 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:36 crc kubenswrapper[4931]: E1201 15:02:36.240686 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:36 crc kubenswrapper[4931]: E1201 15:02:36.240868 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:36 crc kubenswrapper[4931]: E1201 15:02:36.241045 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.241888 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:02:36 crc kubenswrapper[4931]: E1201 15:02:36.242072 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.251752 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.251849 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.251901 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.251928 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.251976 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.354913 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.354974 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.354992 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.355059 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.355081 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.458839 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.458893 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.458911 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.458936 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.458955 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.562479 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.562550 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.562573 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.562602 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.562620 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.666309 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.666420 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.666441 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.666466 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.666483 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.769951 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.769991 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.770018 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.770034 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.770042 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.872989 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.873050 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.873068 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.873095 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.873111 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.976833 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.976911 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.976934 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.976961 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:36 crc kubenswrapper[4931]: I1201 15:02:36.976979 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:36Z","lastTransitionTime":"2025-12-01T15:02:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.079998 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.080081 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.080108 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.080217 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.080262 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:37Z","lastTransitionTime":"2025-12-01T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.183954 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.184022 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.184038 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.184066 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.184089 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:37Z","lastTransitionTime":"2025-12-01T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.241082 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:37 crc kubenswrapper[4931]: E1201 15:02:37.241279 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.287413 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.287459 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.287478 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.287501 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.287520 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:37Z","lastTransitionTime":"2025-12-01T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.390578 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.390646 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.390665 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.390690 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.390710 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:37Z","lastTransitionTime":"2025-12-01T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.493102 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.493184 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.493213 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.493239 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.493256 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:37Z","lastTransitionTime":"2025-12-01T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.596165 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.596229 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.596247 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.596271 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.596290 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:37Z","lastTransitionTime":"2025-12-01T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.699490 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.699549 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.699565 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.699591 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.699610 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:37Z","lastTransitionTime":"2025-12-01T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.802066 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.802143 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.802166 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.802199 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.802222 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:37Z","lastTransitionTime":"2025-12-01T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.904421 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.904463 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.904475 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.904493 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:37 crc kubenswrapper[4931]: I1201 15:02:37.904506 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:37Z","lastTransitionTime":"2025-12-01T15:02:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.006746 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.006840 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.006859 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.006913 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.006931 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:38Z","lastTransitionTime":"2025-12-01T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.108874 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.108911 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.108921 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.108935 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.108945 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:38Z","lastTransitionTime":"2025-12-01T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.116892 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.116944 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.116955 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.116972 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.116983 4931 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T15:02:38Z","lastTransitionTime":"2025-12-01T15:02:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.209719 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr"] Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.210287 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.214225 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.214544 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.215498 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.216469 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.231857 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=57.231833933 podStartE2EDuration="57.231833933s" podCreationTimestamp="2025-12-01 15:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:38.229933286 +0000 UTC m=+104.655806973" watchObservedRunningTime="2025-12-01 15:02:38.231833933 +0000 UTC m=+104.657707620" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.240826 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.240902 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:38 crc kubenswrapper[4931]: E1201 15:02:38.240950 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.240898 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:38 crc kubenswrapper[4931]: E1201 15:02:38.241072 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:38 crc kubenswrapper[4931]: E1201 15:02:38.241238 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.277793 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=83.277771935 podStartE2EDuration="1m23.277771935s" podCreationTimestamp="2025-12-01 15:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:38.276898689 +0000 UTC m=+104.702772366" watchObservedRunningTime="2025-12-01 15:02:38.277771935 +0000 UTC m=+104.703645612" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.277973 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.277968961 podStartE2EDuration="23.277968961s" podCreationTimestamp="2025-12-01 15:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:38.248111759 +0000 UTC m=+104.673985446" watchObservedRunningTime="2025-12-01 15:02:38.277968961 +0000 UTC m=+104.703842638" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.293293 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/02568dbb-0619-47ef-b5c5-0c619205dd0c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.293358 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/02568dbb-0619-47ef-b5c5-0c619205dd0c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.293404 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/02568dbb-0619-47ef-b5c5-0c619205dd0c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.293442 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02568dbb-0619-47ef-b5c5-0c619205dd0c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.293465 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02568dbb-0619-47ef-b5c5-0c619205dd0c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.294175 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=82.294113883 podStartE2EDuration="1m22.294113883s" podCreationTimestamp="2025-12-01 15:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:38.293821394 +0000 UTC m=+104.719695071" watchObservedRunningTime="2025-12-01 15:02:38.294113883 +0000 UTC m=+104.719987570" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.317145 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6nwqj" podStartSLOduration=85.31713027 podStartE2EDuration="1m25.31713027s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:38.315853372 +0000 UTC m=+104.741727059" watchObservedRunningTime="2025-12-01 15:02:38.31713027 +0000 UTC m=+104.743003947" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.394849 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02568dbb-0619-47ef-b5c5-0c619205dd0c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.394907 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02568dbb-0619-47ef-b5c5-0c619205dd0c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.394965 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/02568dbb-0619-47ef-b5c5-0c619205dd0c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.394997 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02568dbb-0619-47ef-b5c5-0c619205dd0c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.395027 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/02568dbb-0619-47ef-b5c5-0c619205dd0c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.395090 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/02568dbb-0619-47ef-b5c5-0c619205dd0c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.395795 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/02568dbb-0619-47ef-b5c5-0c619205dd0c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.396335 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/02568dbb-0619-47ef-b5c5-0c619205dd0c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.401850 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02568dbb-0619-47ef-b5c5-0c619205dd0c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.420000 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02568dbb-0619-47ef-b5c5-0c619205dd0c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-599sr\" (UID: \"02568dbb-0619-47ef-b5c5-0c619205dd0c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.537079 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" Dec 01 15:02:38 crc kubenswrapper[4931]: W1201 15:02:38.551766 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02568dbb_0619_47ef_b5c5_0c619205dd0c.slice/crio-d427b4e71652a7de756b296e9b3387be4b7752b9c3482b9741a50c0a39b44a20 WatchSource:0}: Error finding container d427b4e71652a7de756b296e9b3387be4b7752b9c3482b9741a50c0a39b44a20: Status 404 returned error can't find the container with id d427b4e71652a7de756b296e9b3387be4b7752b9c3482b9741a50c0a39b44a20 Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.825260 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" event={"ID":"02568dbb-0619-47ef-b5c5-0c619205dd0c","Type":"ContainerStarted","Data":"b8084750ef3c1e40c222be74c4fe285daa6ffb5b6d6c2317ccd85894aa4b1a49"} Dec 01 15:02:38 crc kubenswrapper[4931]: I1201 15:02:38.825313 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" event={"ID":"02568dbb-0619-47ef-b5c5-0c619205dd0c","Type":"ContainerStarted","Data":"d427b4e71652a7de756b296e9b3387be4b7752b9c3482b9741a50c0a39b44a20"} Dec 01 15:02:39 crc kubenswrapper[4931]: I1201 15:02:39.241149 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:39 crc kubenswrapper[4931]: E1201 15:02:39.241293 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:40 crc kubenswrapper[4931]: I1201 15:02:40.241419 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:40 crc kubenswrapper[4931]: I1201 15:02:40.241498 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:40 crc kubenswrapper[4931]: I1201 15:02:40.241566 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:40 crc kubenswrapper[4931]: E1201 15:02:40.241612 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:40 crc kubenswrapper[4931]: E1201 15:02:40.241721 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:40 crc kubenswrapper[4931]: E1201 15:02:40.241821 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:41 crc kubenswrapper[4931]: I1201 15:02:41.241014 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:41 crc kubenswrapper[4931]: E1201 15:02:41.241276 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:42 crc kubenswrapper[4931]: I1201 15:02:42.240663 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:42 crc kubenswrapper[4931]: I1201 15:02:42.240714 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:42 crc kubenswrapper[4931]: I1201 15:02:42.240714 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:42 crc kubenswrapper[4931]: E1201 15:02:42.240948 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:42 crc kubenswrapper[4931]: E1201 15:02:42.241098 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:42 crc kubenswrapper[4931]: E1201 15:02:42.241164 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:43 crc kubenswrapper[4931]: I1201 15:02:43.241518 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:43 crc kubenswrapper[4931]: E1201 15:02:43.241703 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:44 crc kubenswrapper[4931]: I1201 15:02:44.240828 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:44 crc kubenswrapper[4931]: I1201 15:02:44.240891 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:44 crc kubenswrapper[4931]: I1201 15:02:44.240926 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:44 crc kubenswrapper[4931]: E1201 15:02:44.241860 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:44 crc kubenswrapper[4931]: E1201 15:02:44.242005 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:44 crc kubenswrapper[4931]: E1201 15:02:44.242155 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:45 crc kubenswrapper[4931]: I1201 15:02:45.240901 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:45 crc kubenswrapper[4931]: E1201 15:02:45.241446 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:46 crc kubenswrapper[4931]: I1201 15:02:46.240868 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:46 crc kubenswrapper[4931]: I1201 15:02:46.241006 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:46 crc kubenswrapper[4931]: I1201 15:02:46.241007 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:46 crc kubenswrapper[4931]: E1201 15:02:46.241183 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:46 crc kubenswrapper[4931]: E1201 15:02:46.241359 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:46 crc kubenswrapper[4931]: E1201 15:02:46.241538 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:47 crc kubenswrapper[4931]: I1201 15:02:47.240977 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:47 crc kubenswrapper[4931]: E1201 15:02:47.241163 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:47 crc kubenswrapper[4931]: I1201 15:02:47.871023 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/1.log" Dec 01 15:02:47 crc kubenswrapper[4931]: I1201 15:02:47.871486 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/0.log" Dec 01 15:02:47 crc kubenswrapper[4931]: I1201 15:02:47.871519 4931 generic.go:334] "Generic (PLEG): container finished" podID="db092a9c-f0f2-401d-82dd-b3af535585cc" containerID="056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991" exitCode=1 Dec 01 15:02:47 crc kubenswrapper[4931]: I1201 15:02:47.871549 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nwqj" event={"ID":"db092a9c-f0f2-401d-82dd-b3af535585cc","Type":"ContainerDied","Data":"056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991"} Dec 01 15:02:47 crc kubenswrapper[4931]: I1201 15:02:47.871587 4931 scope.go:117] "RemoveContainer" containerID="a59886201c02453d3f7bbb38d6fda679ef42009484bf6334c4c859cd800c45b8" Dec 01 15:02:47 crc kubenswrapper[4931]: I1201 15:02:47.872000 4931 scope.go:117] "RemoveContainer" containerID="056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991" Dec 01 15:02:47 crc kubenswrapper[4931]: E1201 15:02:47.872150 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6nwqj_openshift-multus(db092a9c-f0f2-401d-82dd-b3af535585cc)\"" pod="openshift-multus/multus-6nwqj" podUID="db092a9c-f0f2-401d-82dd-b3af535585cc" Dec 01 15:02:47 crc kubenswrapper[4931]: I1201 15:02:47.896097 4931 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-599sr" podStartSLOduration=94.89606588 podStartE2EDuration="1m34.89606588s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:02:38.844120328 +0000 UTC m=+105.269994035" watchObservedRunningTime="2025-12-01 15:02:47.89606588 +0000 UTC m=+114.321939617" Dec 01 15:02:48 crc kubenswrapper[4931]: I1201 15:02:48.240541 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:48 crc kubenswrapper[4931]: I1201 15:02:48.240541 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:48 crc kubenswrapper[4931]: E1201 15:02:48.240793 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:48 crc kubenswrapper[4931]: I1201 15:02:48.240880 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:48 crc kubenswrapper[4931]: E1201 15:02:48.241037 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:48 crc kubenswrapper[4931]: E1201 15:02:48.241180 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:48 crc kubenswrapper[4931]: I1201 15:02:48.905548 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/1.log" Dec 01 15:02:49 crc kubenswrapper[4931]: I1201 15:02:49.240984 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:49 crc kubenswrapper[4931]: E1201 15:02:49.241164 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:50 crc kubenswrapper[4931]: I1201 15:02:50.240481 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:50 crc kubenswrapper[4931]: I1201 15:02:50.240614 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:50 crc kubenswrapper[4931]: E1201 15:02:50.240682 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:50 crc kubenswrapper[4931]: E1201 15:02:50.240838 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:50 crc kubenswrapper[4931]: I1201 15:02:50.241442 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:50 crc kubenswrapper[4931]: E1201 15:02:50.241610 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:50 crc kubenswrapper[4931]: I1201 15:02:50.242019 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:02:50 crc kubenswrapper[4931]: E1201 15:02:50.242524 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v5g28_openshift-ovn-kubernetes(16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" Dec 01 15:02:51 crc kubenswrapper[4931]: I1201 15:02:51.241222 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:51 crc kubenswrapper[4931]: E1201 15:02:51.241463 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:52 crc kubenswrapper[4931]: I1201 15:02:52.240865 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:52 crc kubenswrapper[4931]: I1201 15:02:52.241038 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:52 crc kubenswrapper[4931]: E1201 15:02:52.241045 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:52 crc kubenswrapper[4931]: E1201 15:02:52.241128 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:52 crc kubenswrapper[4931]: I1201 15:02:52.241622 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:52 crc kubenswrapper[4931]: E1201 15:02:52.241850 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:53 crc kubenswrapper[4931]: I1201 15:02:53.240666 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:53 crc kubenswrapper[4931]: E1201 15:02:53.240818 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:54 crc kubenswrapper[4931]: I1201 15:02:54.240661 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:54 crc kubenswrapper[4931]: E1201 15:02:54.244373 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:54 crc kubenswrapper[4931]: I1201 15:02:54.244466 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:54 crc kubenswrapper[4931]: I1201 15:02:54.244523 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:54 crc kubenswrapper[4931]: E1201 15:02:54.244723 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:54 crc kubenswrapper[4931]: E1201 15:02:54.244899 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:54 crc kubenswrapper[4931]: E1201 15:02:54.274184 4931 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 01 15:02:54 crc kubenswrapper[4931]: E1201 15:02:54.352535 4931 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 15:02:55 crc kubenswrapper[4931]: I1201 15:02:55.241294 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:55 crc kubenswrapper[4931]: E1201 15:02:55.242095 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:56 crc kubenswrapper[4931]: I1201 15:02:56.241242 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:56 crc kubenswrapper[4931]: I1201 15:02:56.241303 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:56 crc kubenswrapper[4931]: I1201 15:02:56.241242 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:56 crc kubenswrapper[4931]: E1201 15:02:56.241477 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:56 crc kubenswrapper[4931]: E1201 15:02:56.241635 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:56 crc kubenswrapper[4931]: E1201 15:02:56.241800 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:57 crc kubenswrapper[4931]: I1201 15:02:57.241613 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:57 crc kubenswrapper[4931]: E1201 15:02:57.242736 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:58 crc kubenswrapper[4931]: I1201 15:02:58.240705 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:02:58 crc kubenswrapper[4931]: I1201 15:02:58.240804 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:02:58 crc kubenswrapper[4931]: E1201 15:02:58.240865 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:02:58 crc kubenswrapper[4931]: I1201 15:02:58.240940 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:02:58 crc kubenswrapper[4931]: E1201 15:02:58.241005 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:02:58 crc kubenswrapper[4931]: E1201 15:02:58.241127 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:02:59 crc kubenswrapper[4931]: I1201 15:02:59.241369 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:02:59 crc kubenswrapper[4931]: E1201 15:02:59.241521 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:02:59 crc kubenswrapper[4931]: E1201 15:02:59.354621 4931 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Dec 01 15:03:00 crc kubenswrapper[4931]: I1201 15:03:00.240693 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:00 crc kubenswrapper[4931]: I1201 15:03:00.240693 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:00 crc kubenswrapper[4931]: I1201 15:03:00.240790 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:00 crc kubenswrapper[4931]: E1201 15:03:00.240959 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:03:00 crc kubenswrapper[4931]: E1201 15:03:00.241147 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:03:00 crc kubenswrapper[4931]: E1201 15:03:00.241221 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:03:01 crc kubenswrapper[4931]: I1201 15:03:01.240839 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:03:01 crc kubenswrapper[4931]: E1201 15:03:01.241062 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:03:02 crc kubenswrapper[4931]: I1201 15:03:02.241019 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:02 crc kubenswrapper[4931]: E1201 15:03:02.241219 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:03:02 crc kubenswrapper[4931]: I1201 15:03:02.241548 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:02 crc kubenswrapper[4931]: E1201 15:03:02.241645 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:03:02 crc kubenswrapper[4931]: I1201 15:03:02.241774 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:02 crc kubenswrapper[4931]: E1201 15:03:02.241979 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:03:03 crc kubenswrapper[4931]: I1201 15:03:03.241053 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:03:03 crc kubenswrapper[4931]: I1201 15:03:03.241679 4931 scope.go:117] "RemoveContainer" containerID="056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991" Dec 01 15:03:03 crc kubenswrapper[4931]: E1201 15:03:03.241849 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:03:03 crc kubenswrapper[4931]: I1201 15:03:03.962184 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/1.log" Dec 01 15:03:03 crc kubenswrapper[4931]: I1201 15:03:03.962272 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nwqj" event={"ID":"db092a9c-f0f2-401d-82dd-b3af535585cc","Type":"ContainerStarted","Data":"58e0cafadf10e2f6c28ad954b6ef10668446085bb039d922999d395643c4d133"} Dec 01 15:03:04 crc kubenswrapper[4931]: I1201 15:03:04.240824 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:04 crc kubenswrapper[4931]: I1201 15:03:04.240940 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:04 crc kubenswrapper[4931]: E1201 15:03:04.241812 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:03:04 crc kubenswrapper[4931]: I1201 15:03:04.241905 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:04 crc kubenswrapper[4931]: E1201 15:03:04.242035 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:03:04 crc kubenswrapper[4931]: E1201 15:03:04.242079 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:03:04 crc kubenswrapper[4931]: E1201 15:03:04.355425 4931 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 15:03:05 crc kubenswrapper[4931]: I1201 15:03:05.241246 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:03:05 crc kubenswrapper[4931]: E1201 15:03:05.241419 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:03:05 crc kubenswrapper[4931]: I1201 15:03:05.242669 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:03:05 crc kubenswrapper[4931]: I1201 15:03:05.971643 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/3.log" Dec 01 15:03:05 crc kubenswrapper[4931]: I1201 15:03:05.974439 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerStarted","Data":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} Dec 01 15:03:05 crc kubenswrapper[4931]: I1201 15:03:05.974933 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:03:05 crc kubenswrapper[4931]: I1201 15:03:05.983829 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-78dk9"] Dec 01 15:03:05 crc kubenswrapper[4931]: I1201 15:03:05.983959 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:05 crc kubenswrapper[4931]: E1201 15:03:05.984055 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:03:06 crc kubenswrapper[4931]: I1201 15:03:06.015730 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podStartSLOduration=112.015710326 podStartE2EDuration="1m52.015710326s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:06.014049509 +0000 UTC m=+132.439923176" watchObservedRunningTime="2025-12-01 15:03:06.015710326 +0000 UTC m=+132.441583993" Dec 01 15:03:06 crc kubenswrapper[4931]: I1201 15:03:06.241364 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:06 crc kubenswrapper[4931]: I1201 15:03:06.241479 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:06 crc kubenswrapper[4931]: E1201 15:03:06.241835 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:03:06 crc kubenswrapper[4931]: E1201 15:03:06.241988 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:03:07 crc kubenswrapper[4931]: I1201 15:03:07.241363 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:03:07 crc kubenswrapper[4931]: I1201 15:03:07.241412 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:07 crc kubenswrapper[4931]: E1201 15:03:07.241518 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:03:07 crc kubenswrapper[4931]: E1201 15:03:07.241731 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:03:08 crc kubenswrapper[4931]: I1201 15:03:08.240801 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:08 crc kubenswrapper[4931]: I1201 15:03:08.240815 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:08 crc kubenswrapper[4931]: E1201 15:03:08.241164 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 15:03:08 crc kubenswrapper[4931]: E1201 15:03:08.241316 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 15:03:09 crc kubenswrapper[4931]: I1201 15:03:09.241332 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:09 crc kubenswrapper[4931]: I1201 15:03:09.241332 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:03:09 crc kubenswrapper[4931]: E1201 15:03:09.241574 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78dk9" podUID="2e105961-27de-4865-bd7b-44dd04d12034" Dec 01 15:03:09 crc kubenswrapper[4931]: E1201 15:03:09.241642 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 15:03:10 crc kubenswrapper[4931]: I1201 15:03:10.241179 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:10 crc kubenswrapper[4931]: I1201 15:03:10.241190 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:10 crc kubenswrapper[4931]: I1201 15:03:10.246129 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 15:03:10 crc kubenswrapper[4931]: I1201 15:03:10.246777 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 15:03:10 crc kubenswrapper[4931]: I1201 15:03:10.247028 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 15:03:10 crc kubenswrapper[4931]: I1201 15:03:10.248288 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 15:03:11 crc kubenswrapper[4931]: I1201 15:03:11.241464 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:03:11 crc kubenswrapper[4931]: I1201 15:03:11.241542 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:11 crc kubenswrapper[4931]: I1201 15:03:11.245311 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 15:03:11 crc kubenswrapper[4931]: I1201 15:03:11.245956 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 15:03:16 crc kubenswrapper[4931]: I1201 15:03:16.315512 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:03:18 crc kubenswrapper[4931]: I1201 15:03:18.964037 4931 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.026657 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2tbqk"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.027670 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.029511 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.030589 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.031807 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.032732 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.033263 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.034648 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.043864 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.044785 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mbql2"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.049861 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.063984 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.064269 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.064535 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.065245 4931 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.065596 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.065777 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.066174 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ng4qc"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.066424 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.066693 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.066725 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.067070 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.067096 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.067188 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.067564 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.069282 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.069608 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.069636 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.069851 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.069914 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.070087 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.070247 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.070279 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.070334 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.070569 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.071248 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.071310 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.073347 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lxp5s"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.074132 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.079060 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zz6cp"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.079814 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-b527d"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.080438 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.082613 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.083203 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zz6cp" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.084588 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.084743 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.084912 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.085160 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.085347 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.085467 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-drngn"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.085947 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.086053 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9"] Dec 01 15:03:19 crc 
kubenswrapper[4931]: I1201 15:03:19.086098 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.086262 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.086454 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.086739 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.086940 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.087157 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.088707 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.088953 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.089084 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.093303 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.094091 4931 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.094355 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.095642 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.095687 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.095799 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.096118 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.096463 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.096644 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.096651 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.097117 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.098704 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.098908 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099037 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099150 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099183 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099207 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-knjbc"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099153 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099291 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099366 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099430 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099721 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099733 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099846 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099941 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.099987 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.100116 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.100683 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.100947 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2v4w"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.101548 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.103516 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.103657 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.103804 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zkb6c"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.103828 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.104535 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.108612 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.111627 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.156300 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.156591 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.156729 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.156757 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.157233 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.158456 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.158752 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.159171 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.159240 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.159440 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.159464 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.159526 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.159642 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.159734 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160122 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160232 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160275 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160287 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160325 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160330 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160371 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160416 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160332 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160458 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160461 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160504 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160533 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.160670 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.163509 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.167537 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.164731 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.164827 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.164923 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.167079 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.167147 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.167201 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.167142 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z4bhx"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.170267 4931 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qlml2"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.174653 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.170720 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z4bhx" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.175008 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.174161 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f62b6206-f14c-4f4d-a1c1-af09036abdcf-serving-cert\") pod \"openshift-config-operator-7777fb866f-knjbc\" (UID: \"f62b6206-f14c-4f4d-a1c1-af09036abdcf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176096 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-serving-cert\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.175309 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.175352 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.175560 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176122 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc591dda-88eb-4eff-be2b-dfc142b4aa50-apiservice-cert\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176239 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176266 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9eef0925-d18a-4d34-9301-9cf5f900a39e-serving-cert\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.175638 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176327 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176478 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b20d3e-89d8-4ff9-b56b-611728f913ee-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176506 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176543 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176580 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f62b6206-f14c-4f4d-a1c1-af09036abdcf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-knjbc\" (UID: \"f62b6206-f14c-4f4d-a1c1-af09036abdcf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:19 crc 
kubenswrapper[4931]: I1201 15:03:19.176636 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176656 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9eef0925-d18a-4d34-9301-9cf5f900a39e-etcd-client\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176732 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-image-import-ca\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176772 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djhbf\" (UniqueName: \"kubernetes.io/projected/77e59ae9-2ed1-4c42-a17b-95c677bac560-kube-api-access-djhbf\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176795 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/bc591dda-88eb-4eff-be2b-dfc142b4aa50-webhook-cert\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176804 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mkp6d"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176816 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176834 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1024faa3-55d0-47a5-ad2e-745ec92c0c89-serving-cert\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176850 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77e59ae9-2ed1-4c42-a17b-95c677bac560-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176866 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/aabe6734-63b5-412a-80f3-c07b3a9b3071-etcd-client\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176885 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9xwt\" (UniqueName: \"kubernetes.io/projected/1024faa3-55d0-47a5-ad2e-745ec92c0c89-kube-api-access-v9xwt\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176889 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176900 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176916 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4sv7\" (UniqueName: \"kubernetes.io/projected/a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d-kube-api-access-t4sv7\") pod \"openshift-apiserver-operator-796bbdcf4f-7cvqt\" (UID: \"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176934 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-config\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176968 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9eef0925-d18a-4d34-9301-9cf5f900a39e-encryption-config\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.176984 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bc591dda-88eb-4eff-be2b-dfc142b4aa50-tmpfs\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177000 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmkc8\" (UniqueName: \"kubernetes.io/projected/855d1329-52b5-4c28-bef3-b18cb2a5e33e-kube-api-access-fmkc8\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177018 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tz88\" (UniqueName: \"kubernetes.io/projected/7b5e6777-79ed-4101-9259-38715fc413c6-kube-api-access-5tz88\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177034 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-console-config\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177056 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-service-ca\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177101 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177144 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177173 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7b5e6777-79ed-4101-9259-38715fc413c6-machine-approver-tls\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177199 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177200 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/aabe6734-63b5-412a-80f3-c07b3a9b3071-encryption-config\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177249 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-config\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177267 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-etcd-serving-ca\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177286 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr7b4\" (UniqueName: \"kubernetes.io/projected/10126253-d0e8-4f30-9047-1780a718e251-kube-api-access-fr7b4\") 
pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177304 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/77e59ae9-2ed1-4c42-a17b-95c677bac560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177312 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177333 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-serving-cert\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.177777 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gvtnh"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.178070 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.178099 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.178237 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.182201 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.182236 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk5nl\" (UniqueName: \"kubernetes.io/projected/03b20d3e-89d8-4ff9-b56b-611728f913ee-kube-api-access-sk5nl\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.182257 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/39618657-35da-4567-a8f8-7f53bd12b8be-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fnlv9\" (UID: \"39618657-35da-4567-a8f8-7f53bd12b8be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.182318 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-policies\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.182395 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-oauth-serving-cert\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.182449 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-config\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186590 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186646 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zzv\" (UniqueName: \"kubernetes.io/projected/af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f-kube-api-access-z9zzv\") pod \"downloads-7954f5f757-zz6cp\" (UID: \"af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f\") " pod="openshift-console/downloads-7954f5f757-zz6cp" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186689 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-dir\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: 
\"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186709 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7cvqt\" (UID: \"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186727 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b20d3e-89d8-4ff9-b56b-611728f913ee-service-ca-bundle\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186745 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-client-ca\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186762 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186137 4931 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186827 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hlf6\" (UniqueName: \"kubernetes.io/projected/9eef0925-d18a-4d34-9301-9cf5f900a39e-kube-api-access-4hlf6\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186849 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7cvqt\" (UID: \"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186871 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-images\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186891 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aabe6734-63b5-412a-80f3-c07b3a9b3071-audit-dir\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186909 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186953 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a8ab13b-92c4-40b4-93fa-3bde54f728d6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c8v74\" (UID: \"2a8ab13b-92c4-40b4-93fa-3bde54f728d6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.186970 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-audit\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187044 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187134 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/aabe6734-63b5-412a-80f3-c07b3a9b3071-node-pullsecrets\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " 
pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187164 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10126253-d0e8-4f30-9047-1780a718e251-serving-cert\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187183 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mswth\" (UniqueName: \"kubernetes.io/projected/c907e960-f833-4546-89df-491334c4fe72-kube-api-access-mswth\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187204 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h4tl\" (UniqueName: \"kubernetes.io/projected/bc591dda-88eb-4eff-be2b-dfc142b4aa50-kube-api-access-6h4tl\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187224 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-config\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187244 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9eef0925-d18a-4d34-9301-9cf5f900a39e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187264 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7958v\" (UniqueName: \"kubernetes.io/projected/39618657-35da-4567-a8f8-7f53bd12b8be-kube-api-access-7958v\") pod \"cluster-samples-operator-665b6dd947-fnlv9\" (UID: \"39618657-35da-4567-a8f8-7f53bd12b8be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187304 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-trusted-ca-bundle\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187326 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p762\" (UniqueName: \"kubernetes.io/projected/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-kube-api-access-8p762\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187346 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aabe6734-63b5-412a-80f3-c07b3a9b3071-serving-cert\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc 
kubenswrapper[4931]: I1201 15:03:19.187413 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b20d3e-89d8-4ff9-b56b-611728f913ee-serving-cert\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187436 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b5e6777-79ed-4101-9259-38715fc413c6-auth-proxy-config\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187455 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187478 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrknr\" (UniqueName: \"kubernetes.io/projected/f62b6206-f14c-4f4d-a1c1-af09036abdcf-kube-api-access-rrknr\") pod \"openshift-config-operator-7777fb866f-knjbc\" (UID: \"f62b6206-f14c-4f4d-a1c1-af09036abdcf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187503 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-oauth-config\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187581 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5e6777-79ed-4101-9259-38715fc413c6-config\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187603 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-trusted-ca\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187619 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-client-ca\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187638 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77e59ae9-2ed1-4c42-a17b-95c677bac560-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc 
kubenswrapper[4931]: I1201 15:03:19.187667 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg6cr\" (UniqueName: \"kubernetes.io/projected/2a8ab13b-92c4-40b4-93fa-3bde54f728d6-kube-api-access-bg6cr\") pod \"openshift-controller-manager-operator-756b6f6bc6-c8v74\" (UID: \"2a8ab13b-92c4-40b4-93fa-3bde54f728d6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187684 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lncqm\" (UniqueName: \"kubernetes.io/projected/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-kube-api-access-lncqm\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187717 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b20d3e-89d8-4ff9-b56b-611728f913ee-config\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187734 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqnn6\" (UniqueName: \"kubernetes.io/projected/aabe6734-63b5-412a-80f3-c07b3a9b3071-kube-api-access-mqnn6\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187761 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/9eef0925-d18a-4d34-9301-9cf5f900a39e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187777 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8ab13b-92c4-40b4-93fa-3bde54f728d6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c8v74\" (UID: \"2a8ab13b-92c4-40b4-93fa-3bde54f728d6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187812 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9eef0925-d18a-4d34-9301-9cf5f900a39e-audit-dir\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187829 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-config\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.187862 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9eef0925-d18a-4d34-9301-9cf5f900a39e-audit-policies\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 
15:03:19.188564 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t6hrd"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.189746 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.202562 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.203123 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.203306 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.204202 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.207668 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.208936 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.210763 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.212924 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dgzzm"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.221559 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.223488 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.224129 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.224197 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.224536 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.225474 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-ckz9k"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.226146 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.226168 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gm6vp"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.226939 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.227142 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.227785 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.229186 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.229325 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.229724 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.230333 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.230799 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.232073 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.232570 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.232604 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.233418 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.233964 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.235454 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.236578 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m5dvh"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.237406 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.237653 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2tbqk"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.238971 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-drngn"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.241844 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.244004 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.244937 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zkb6c"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.245908 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-knjbc"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.248904 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ng4qc"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.249919 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.252629 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zz6cp"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.253860 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9"] Dec 01 15:03:19 crc 
kubenswrapper[4931]: I1201 15:03:19.254125 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.255198 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dgzzm"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.257099 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t6hrd"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.257958 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.258928 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lxp5s"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.260110 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.262102 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.263088 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-csjj4"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.263686 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-csjj4" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.264527 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s2ws6"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.265039 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.266159 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m5dvh"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.267583 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.268651 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.269669 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gvtnh"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.269758 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.270680 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z4bhx"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.272230 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.273163 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.274498 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mbql2"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.275479 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qlml2"] Dec 01 15:03:19 crc 
kubenswrapper[4931]: I1201 15:03:19.276964 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.278996 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2v4w"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.279168 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mkp6d"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.280151 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.288798 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.288969 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zzv\" (UniqueName: \"kubernetes.io/projected/af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f-kube-api-access-z9zzv\") pod \"downloads-7954f5f757-zz6cp\" (UID: \"af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f\") " pod="openshift-console/downloads-7954f5f757-zz6cp" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289017 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d46b073-f023-4090-a6ec-4916356b1e4d-service-ca-bundle\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 
15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289063 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-dir\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289303 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7cvqt\" (UID: \"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289367 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-dir\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289364 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b20d3e-89d8-4ff9-b56b-611728f913ee-service-ca-bundle\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289452 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6d46b073-f023-4090-a6ec-4916356b1e4d-default-certificate\") pod \"router-default-5444994796-ckz9k\" (UID: 
\"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289481 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-client-ca\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289512 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289530 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289543 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hlf6\" (UniqueName: \"kubernetes.io/projected/9eef0925-d18a-4d34-9301-9cf5f900a39e-kube-api-access-4hlf6\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289640 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7cvqt\" (UID: \"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289671 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-images\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289696 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aabe6734-63b5-412a-80f3-c07b3a9b3071-audit-dir\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289723 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289751 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a8ab13b-92c4-40b4-93fa-3bde54f728d6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c8v74\" (UID: \"2a8ab13b-92c4-40b4-93fa-3bde54f728d6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289773 
4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-audit\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289795 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289828 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/aabe6734-63b5-412a-80f3-c07b3a9b3071-node-pullsecrets\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289848 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10126253-d0e8-4f30-9047-1780a718e251-serving-cert\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289871 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mswth\" (UniqueName: \"kubernetes.io/projected/c907e960-f833-4546-89df-491334c4fe72-kube-api-access-mswth\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc 
kubenswrapper[4931]: I1201 15:03:19.289892 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h4tl\" (UniqueName: \"kubernetes.io/projected/bc591dda-88eb-4eff-be2b-dfc142b4aa50-kube-api-access-6h4tl\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289915 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-config\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289940 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eef0925-d18a-4d34-9301-9cf5f900a39e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289961 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7958v\" (UniqueName: \"kubernetes.io/projected/39618657-35da-4567-a8f8-7f53bd12b8be-kube-api-access-7958v\") pod \"cluster-samples-operator-665b6dd947-fnlv9\" (UID: \"39618657-35da-4567-a8f8-7f53bd12b8be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289975 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.290481 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7cvqt\" (UID: \"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.290520 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b20d3e-89d8-4ff9-b56b-611728f913ee-service-ca-bundle\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.290573 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aabe6734-63b5-412a-80f3-c07b3a9b3071-audit-dir\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.291195 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-images\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.289986 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-trusted-ca-bundle\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.291781 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8p762\" (UniqueName: \"kubernetes.io/projected/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-kube-api-access-8p762\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.291810 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-audit\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.291823 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aabe6734-63b5-412a-80f3-c07b3a9b3071-serving-cert\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.291681 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-trusted-ca-bundle\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.291681 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-client-ca\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.292043 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.292102 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b20d3e-89d8-4ff9-b56b-611728f913ee-serving-cert\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.292130 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b5e6777-79ed-4101-9259-38715fc413c6-auth-proxy-config\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.292702 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/aabe6734-63b5-412a-80f3-c07b3a9b3071-node-pullsecrets\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.292714 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9eef0925-d18a-4d34-9301-9cf5f900a39e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 
15:03:19.292748 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.292781 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrknr\" (UniqueName: \"kubernetes.io/projected/f62b6206-f14c-4f4d-a1c1-af09036abdcf-kube-api-access-rrknr\") pod \"openshift-config-operator-7777fb866f-knjbc\" (UID: \"f62b6206-f14c-4f4d-a1c1-af09036abdcf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.292849 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-oauth-config\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.292977 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5e6777-79ed-4101-9259-38715fc413c6-config\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293018 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-trusted-ca\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " 
pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293043 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-client-ca\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293070 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77e59ae9-2ed1-4c42-a17b-95c677bac560-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293106 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg6cr\" (UniqueName: \"kubernetes.io/projected/2a8ab13b-92c4-40b4-93fa-3bde54f728d6-kube-api-access-bg6cr\") pod \"openshift-controller-manager-operator-756b6f6bc6-c8v74\" (UID: \"2a8ab13b-92c4-40b4-93fa-3bde54f728d6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293130 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lncqm\" (UniqueName: \"kubernetes.io/projected/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-kube-api-access-lncqm\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293158 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/03b20d3e-89d8-4ff9-b56b-611728f913ee-config\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293175 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqnn6\" (UniqueName: \"kubernetes.io/projected/aabe6734-63b5-412a-80f3-c07b3a9b3071-kube-api-access-mqnn6\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293199 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9eef0925-d18a-4d34-9301-9cf5f900a39e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293221 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8ab13b-92c4-40b4-93fa-3bde54f728d6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c8v74\" (UID: \"2a8ab13b-92c4-40b4-93fa-3bde54f728d6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293256 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9eef0925-d18a-4d34-9301-9cf5f900a39e-audit-dir\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293276 
4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-config\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293299 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9eef0925-d18a-4d34-9301-9cf5f900a39e-audit-policies\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293332 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntwk2\" (UniqueName: \"kubernetes.io/projected/6d46b073-f023-4090-a6ec-4916356b1e4d-kube-api-access-ntwk2\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293361 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f62b6206-f14c-4f4d-a1c1-af09036abdcf-serving-cert\") pod \"openshift-config-operator-7777fb866f-knjbc\" (UID: \"f62b6206-f14c-4f4d-a1c1-af09036abdcf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293400 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-serving-cert\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" 
Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293529 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b5e6777-79ed-4101-9259-38715fc413c6-auth-proxy-config\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293420 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc591dda-88eb-4eff-be2b-dfc142b4aa50-apiservice-cert\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293853 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293864 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293890 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9eef0925-d18a-4d34-9301-9cf5f900a39e-serving-cert\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: 
\"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293916 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b20d3e-89d8-4ff9-b56b-611728f913ee-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.293950 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294057 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294151 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f62b6206-f14c-4f4d-a1c1-af09036abdcf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-knjbc\" (UID: \"f62b6206-f14c-4f4d-a1c1-af09036abdcf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294207 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294216 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5e6777-79ed-4101-9259-38715fc413c6-config\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294244 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9eef0925-d18a-4d34-9301-9cf5f900a39e-etcd-client\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294294 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-image-import-ca\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294326 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djhbf\" (UniqueName: \"kubernetes.io/projected/77e59ae9-2ed1-4c42-a17b-95c677bac560-kube-api-access-djhbf\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 
crc kubenswrapper[4931]: I1201 15:03:19.294368 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc591dda-88eb-4eff-be2b-dfc142b4aa50-webhook-cert\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294439 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294488 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1024faa3-55d0-47a5-ad2e-745ec92c0c89-serving-cert\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294589 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77e59ae9-2ed1-4c42-a17b-95c677bac560-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294656 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aabe6734-63b5-412a-80f3-c07b3a9b3071-etcd-client\") pod \"apiserver-76f77b778f-2tbqk\" (UID: 
\"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294692 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9xwt\" (UniqueName: \"kubernetes.io/projected/1024faa3-55d0-47a5-ad2e-745ec92c0c89-kube-api-access-v9xwt\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295263 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9eef0925-d18a-4d34-9301-9cf5f900a39e-audit-policies\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294732 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295337 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4sv7\" (UniqueName: \"kubernetes.io/projected/a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d-kube-api-access-t4sv7\") pod \"openshift-apiserver-operator-796bbdcf4f-7cvqt\" (UID: \"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295369 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-config\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295424 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9eef0925-d18a-4d34-9301-9cf5f900a39e-encryption-config\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295449 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bc591dda-88eb-4eff-be2b-dfc142b4aa50-tmpfs\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295471 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmkc8\" (UniqueName: \"kubernetes.io/projected/855d1329-52b5-4c28-bef3-b18cb2a5e33e-kube-api-access-fmkc8\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295510 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tz88\" (UniqueName: \"kubernetes.io/projected/7b5e6777-79ed-4101-9259-38715fc413c6-kube-api-access-5tz88\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295537 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-console-config\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295565 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-service-ca\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295592 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295621 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6d46b073-f023-4090-a6ec-4916356b1e4d-stats-auth\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295648 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295648 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9eef0925-d18a-4d34-9301-9cf5f900a39e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295671 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7b5e6777-79ed-4101-9259-38715fc413c6-machine-approver-tls\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295733 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/aabe6734-63b5-412a-80f3-c07b3a9b3071-encryption-config\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.296293 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b20d3e-89d8-4ff9-b56b-611728f913ee-config\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.296421 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-config\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: 
\"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.296670 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297049 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-config\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.296696 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-trusted-ca\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.296775 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-client-ca\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.296899 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f62b6206-f14c-4f4d-a1c1-af09036abdcf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-knjbc\" (UID: \"f62b6206-f14c-4f4d-a1c1-af09036abdcf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:19 crc 
kubenswrapper[4931]: I1201 15:03:19.294151 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9eef0925-d18a-4d34-9301-9cf5f900a39e-audit-dir\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.296967 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77e59ae9-2ed1-4c42-a17b-95c677bac560-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.295773 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-config\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297239 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-etcd-serving-ca\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297503 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr7b4\" (UniqueName: \"kubernetes.io/projected/10126253-d0e8-4f30-9047-1780a718e251-kube-api-access-fr7b4\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" 
Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297534 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/77e59ae9-2ed1-4c42-a17b-95c677bac560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297588 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-serving-cert\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297617 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297659 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk5nl\" (UniqueName: \"kubernetes.io/projected/03b20d3e-89d8-4ff9-b56b-611728f913ee-kube-api-access-sk5nl\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297702 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/39618657-35da-4567-a8f8-7f53bd12b8be-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fnlv9\" (UID: \"39618657-35da-4567-a8f8-7f53bd12b8be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297737 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d46b073-f023-4090-a6ec-4916356b1e4d-metrics-certs\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297771 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-policies\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297797 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-oauth-serving-cert\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.297832 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-config\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.298087 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-config\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.298189 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-console-config\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.298506 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.294374 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-config\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.299130 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aabe6734-63b5-412a-80f3-c07b3a9b3071-serving-cert\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.299169 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bc591dda-88eb-4eff-be2b-dfc142b4aa50-tmpfs\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.299325 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a8ab13b-92c4-40b4-93fa-3bde54f728d6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c8v74\" (UID: \"2a8ab13b-92c4-40b4-93fa-3bde54f728d6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.299520 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b20d3e-89d8-4ff9-b56b-611728f913ee-serving-cert\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.299566 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.299733 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9eef0925-d18a-4d34-9301-9cf5f900a39e-encryption-config\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 
15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.299807 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f62b6206-f14c-4f4d-a1c1-af09036abdcf-serving-cert\") pod \"openshift-config-operator-7777fb866f-knjbc\" (UID: \"f62b6206-f14c-4f4d-a1c1-af09036abdcf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.299909 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8ab13b-92c4-40b4-93fa-3bde54f728d6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c8v74\" (UID: \"2a8ab13b-92c4-40b4-93fa-3bde54f728d6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.299994 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.300172 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7cvqt\" (UID: \"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.302645 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.302750 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-oauth-config\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.303266 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.303858 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.304738 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/39618657-35da-4567-a8f8-7f53bd12b8be-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fnlv9\" (UID: \"39618657-35da-4567-a8f8-7f53bd12b8be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.305166 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1024faa3-55d0-47a5-ad2e-745ec92c0c89-serving-cert\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.305192 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s2ws6"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.306085 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.306016 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7b5e6777-79ed-4101-9259-38715fc413c6-machine-approver-tls\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.307091 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-etcd-serving-ca\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.307520 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.298745 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-config\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: 
\"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.308145 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.308599 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/aabe6734-63b5-412a-80f3-c07b3a9b3071-image-import-ca\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.309174 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc591dda-88eb-4eff-be2b-dfc142b4aa50-apiservice-cert\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.309346 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-oauth-serving-cert\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.309448 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.309528 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.301438 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10126253-d0e8-4f30-9047-1780a718e251-serving-cert\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.309648 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-service-ca\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.309726 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.309804 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-serving-cert\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") 
" pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.310745 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.310829 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/77e59ae9-2ed1-4c42-a17b-95c677bac560-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.310853 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9eef0925-d18a-4d34-9301-9cf5f900a39e-etcd-client\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.311173 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.311455 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/aabe6734-63b5-412a-80f3-c07b3a9b3071-encryption-config\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.311518 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-serving-cert\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.311854 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aabe6734-63b5-412a-80f3-c07b3a9b3071-etcd-client\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.312057 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.312074 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-policies\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.312398 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.313236 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc591dda-88eb-4eff-be2b-dfc142b4aa50-webhook-cert\") pod 
\"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.313487 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gm6vp"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.314543 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fwfk5"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.315855 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.316230 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b20d3e-89d8-4ff9-b56b-611728f913ee-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.316754 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fwfk5"] Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.317118 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9eef0925-d18a-4d34-9301-9cf5f900a39e-serving-cert\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.330338 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.351192 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.369602 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.389692 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.398939 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntwk2\" (UniqueName: \"kubernetes.io/projected/6d46b073-f023-4090-a6ec-4916356b1e4d-kube-api-access-ntwk2\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.399030 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6d46b073-f023-4090-a6ec-4916356b1e4d-stats-auth\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.399088 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d46b073-f023-4090-a6ec-4916356b1e4d-metrics-certs\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.399234 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d46b073-f023-4090-a6ec-4916356b1e4d-service-ca-bundle\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " 
pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.399268 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6d46b073-f023-4090-a6ec-4916356b1e4d-default-certificate\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.410071 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.429054 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.449509 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.469659 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.490435 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.509471 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.529660 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.549876 4931 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.570963 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.590239 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.610381 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.642520 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.649799 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.670545 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.689752 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.710066 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.730859 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.750009 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 
15:03:19.769440 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.790138 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.809596 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.829237 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.850068 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.870661 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.872025 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.872095 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.890116 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 15:03:19 crc 
kubenswrapper[4931]: I1201 15:03:19.909619 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.930863 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.950028 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.970205 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 15:03:19 crc kubenswrapper[4931]: I1201 15:03:19.990922 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.011199 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.030126 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.049778 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.070836 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.090328 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.110690 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.110922 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 15:03:20 crc kubenswrapper[4931]: E1201 15:03:20.110942 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:05:22.110883368 +0000 UTC m=+268.536757045 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.111050 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.111313 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.111441 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.111626 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.112471 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.117162 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:03:20 crc 
kubenswrapper[4931]: I1201 15:03:20.117711 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.119452 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.131223 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.149748 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.166248 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.170728 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.179423 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.191122 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.211713 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.228591 4931 request.go:700] Waited for 1.003622806s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/secrets?fieldSelector=metadata.name%3Dkube-scheduler-operator-serving-cert&limit=500&resourceVersion=0 Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.230884 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.251424 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.270451 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.277645 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.290135 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.311778 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.330915 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.344834 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6d46b073-f023-4090-a6ec-4916356b1e4d-default-certificate\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.349996 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.362114 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6d46b073-f023-4090-a6ec-4916356b1e4d-stats-auth\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.370765 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.382913 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6d46b073-f023-4090-a6ec-4916356b1e4d-metrics-certs\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.391458 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.400632 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d46b073-f023-4090-a6ec-4916356b1e4d-service-ca-bundle\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.411130 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.431672 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.450420 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.470214 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.492910 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.510795 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 15:03:20 crc kubenswrapper[4931]: W1201 15:03:20.519700 4931 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-4ebb7353c6f18e680c17d0600c73dc039d63b6da8ccac3f95a36a01b0bad80a7 WatchSource:0}: Error finding container 4ebb7353c6f18e680c17d0600c73dc039d63b6da8ccac3f95a36a01b0bad80a7: Status 404 returned error can't find the container with id 4ebb7353c6f18e680c17d0600c73dc039d63b6da8ccac3f95a36a01b0bad80a7 Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.530375 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.558188 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.571327 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.590360 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.611790 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 15:03:20 crc kubenswrapper[4931]: W1201 15:03:20.628264 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-0979fd1af3e560a2d88ebac017fe31646d2bdce2624cc3d70329f4559329fbec WatchSource:0}: Error finding container 0979fd1af3e560a2d88ebac017fe31646d2bdce2624cc3d70329f4559329fbec: Status 404 returned error can't find the container with id 0979fd1af3e560a2d88ebac017fe31646d2bdce2624cc3d70329f4559329fbec Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.629638 4931 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.651084 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.671364 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.690814 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.711208 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.732621 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.751813 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.771630 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.790104 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.810735 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" 
Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.831964 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.849691 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.870613 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.930034 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.950485 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.971530 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 15:03:20 crc kubenswrapper[4931]: I1201 15:03:20.989986 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.011911 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.031107 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.040650 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ea44d6713ecb5465184086c619db7f8117e721801fbc9ad73755e624abd7d537"} Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.040738 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0979fd1af3e560a2d88ebac017fe31646d2bdce2624cc3d70329f4559329fbec"} Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.042955 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7260473ab1d2be597b0c256b6e16e00cb5d6f664c6939a08135978f9f46c3263"} Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.043321 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4ebb7353c6f18e680c17d0600c73dc039d63b6da8ccac3f95a36a01b0bad80a7"} Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.046170 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"bed13ed71d40ca166c501595328c6257314e053c962282ae2d99513103c52b3f"} Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.046430 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6fba6dcff5f3f6929d0a0b52de65d7e9413306836f21a3266d895694c21932c3"} Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.046845 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.085011 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zzv\" (UniqueName: \"kubernetes.io/projected/af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f-kube-api-access-z9zzv\") pod \"downloads-7954f5f757-zz6cp\" (UID: \"af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f\") " pod="openshift-console/downloads-7954f5f757-zz6cp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.102483 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hlf6\" (UniqueName: \"kubernetes.io/projected/9eef0925-d18a-4d34-9301-9cf5f900a39e-kube-api-access-4hlf6\") pod \"apiserver-7bbb656c7d-p2z4b\" (UID: \"9eef0925-d18a-4d34-9301-9cf5f900a39e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.107004 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mswth\" (UniqueName: \"kubernetes.io/projected/c907e960-f833-4546-89df-491334c4fe72-kube-api-access-mswth\") pod \"console-f9d7485db-drngn\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.141465 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h4tl\" (UniqueName: \"kubernetes.io/projected/bc591dda-88eb-4eff-be2b-dfc142b4aa50-kube-api-access-6h4tl\") pod \"packageserver-d55dfcdfc-9tvd2\" (UID: \"bc591dda-88eb-4eff-be2b-dfc142b4aa50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.158666 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.163682 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7958v\" (UniqueName: \"kubernetes.io/projected/39618657-35da-4567-a8f8-7f53bd12b8be-kube-api-access-7958v\") pod \"cluster-samples-operator-665b6dd947-fnlv9\" (UID: \"39618657-35da-4567-a8f8-7f53bd12b8be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.179737 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p762\" (UniqueName: \"kubernetes.io/projected/cd7f3d80-238e-4e01-8d1a-4ee23eb29230-kube-api-access-8p762\") pod \"console-operator-58897d9998-zkb6c\" (UID: \"cd7f3d80-238e-4e01-8d1a-4ee23eb29230\") " pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.201571 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqnn6\" (UniqueName: \"kubernetes.io/projected/aabe6734-63b5-412a-80f3-c07b3a9b3071-kube-api-access-mqnn6\") pod \"apiserver-76f77b778f-2tbqk\" (UID: \"aabe6734-63b5-412a-80f3-c07b3a9b3071\") " pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.224341 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg6cr\" (UniqueName: \"kubernetes.io/projected/2a8ab13b-92c4-40b4-93fa-3bde54f728d6-kube-api-access-bg6cr\") pod \"openshift-controller-manager-operator-756b6f6bc6-c8v74\" (UID: \"2a8ab13b-92c4-40b4-93fa-3bde54f728d6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.225962 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.257626 4931 request.go:700] Waited for 1.96107544s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.260355 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77e59ae9-2ed1-4c42-a17b-95c677bac560-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.273087 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lncqm\" (UniqueName: \"kubernetes.io/projected/0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7-kube-api-access-lncqm\") pod \"machine-api-operator-5694c8668f-mbql2\" (UID: \"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.283273 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4sv7\" (UniqueName: \"kubernetes.io/projected/a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d-kube-api-access-t4sv7\") pod \"openshift-apiserver-operator-796bbdcf4f-7cvqt\" (UID: \"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.285888 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.316712 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrknr\" (UniqueName: \"kubernetes.io/projected/f62b6206-f14c-4f4d-a1c1-af09036abdcf-kube-api-access-rrknr\") pod \"openshift-config-operator-7777fb866f-knjbc\" (UID: \"f62b6206-f14c-4f4d-a1c1-af09036abdcf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.332565 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.342950 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.346277 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmkc8\" (UniqueName: \"kubernetes.io/projected/855d1329-52b5-4c28-bef3-b18cb2a5e33e-kube-api-access-fmkc8\") pod \"oauth-openshift-558db77b4-l2v4w\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.347011 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tz88\" (UniqueName: \"kubernetes.io/projected/7b5e6777-79ed-4101-9259-38715fc413c6-kube-api-access-5tz88\") pod \"machine-approver-56656f9798-b527d\" (UID: \"7b5e6777-79ed-4101-9259-38715fc413c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.351433 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zz6cp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.356859 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.362494 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.362716 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9xwt\" (UniqueName: \"kubernetes.io/projected/1024faa3-55d0-47a5-ad2e-745ec92c0c89-kube-api-access-v9xwt\") pod \"route-controller-manager-6576b87f9c-b9wzm\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.368824 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.376287 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.381205 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr7b4\" (UniqueName: \"kubernetes.io/projected/10126253-d0e8-4f30-9047-1780a718e251-kube-api-access-fr7b4\") pod \"controller-manager-879f6c89f-ng4qc\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.384146 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.398352 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk5nl\" (UniqueName: \"kubernetes.io/projected/03b20d3e-89d8-4ff9-b56b-611728f913ee-kube-api-access-sk5nl\") pod \"authentication-operator-69f744f599-lxp5s\" (UID: \"03b20d3e-89d8-4ff9-b56b-611728f913ee\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.408511 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djhbf\" (UniqueName: \"kubernetes.io/projected/77e59ae9-2ed1-4c42-a17b-95c677bac560-kube-api-access-djhbf\") pod \"cluster-image-registry-operator-dc59b4c8b-pm6nw\" (UID: \"77e59ae9-2ed1-4c42-a17b-95c677bac560\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.411344 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.431838 4931 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.446601 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.451790 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2"] Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.452251 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.452949 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.465765 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.486964 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntwk2\" (UniqueName: \"kubernetes.io/projected/6d46b073-f023-4090-a6ec-4916356b1e4d-kube-api-access-ntwk2\") pod \"router-default-5444994796-ckz9k\" (UID: \"6d46b073-f023-4090-a6ec-4916356b1e4d\") " pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:21 crc kubenswrapper[4931]: W1201 15:03:21.528835 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc591dda_88eb_4eff_be2b_dfc142b4aa50.slice/crio-a502aba0ffe1fabfe1bfeb97607fbad5adae23166111e47099abc536d276b8ca WatchSource:0}: Error finding container a502aba0ffe1fabfe1bfeb97607fbad5adae23166111e47099abc536d276b8ca: Status 404 returned error can't find the container with id a502aba0ffe1fabfe1bfeb97607fbad5adae23166111e47099abc536d276b8ca Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.537282 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-metrics-tls\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.537737 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aebf976-96d6-4a5b-8650-bcc1fbf09566-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4pkk4\" (UID: \"3aebf976-96d6-4a5b-8650-bcc1fbf09566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.537771 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwsrb\" (UniqueName: \"kubernetes.io/projected/3eb6cc74-4036-413a-b4e9-77ebfd72dfe1-kube-api-access-vwsrb\") pod \"service-ca-operator-777779d784-qlml2\" (UID: \"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.537792 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skvvg\" (UniqueName: \"kubernetes.io/projected/d008c5dd-f44f-4509-b705-46b4c8819684-kube-api-access-skvvg\") pod \"collect-profiles-29410020-rqrgl\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.537874 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: 
\"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.537917 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xdms\" (UniqueName: \"kubernetes.io/projected/c9176363-3b09-4c07-bf06-5a82e81a86e5-kube-api-access-7xdms\") pod \"ingress-canary-z4bhx\" (UID: \"c9176363-3b09-4c07-bf06-5a82e81a86e5\") " pod="openshift-ingress-canary/ingress-canary-z4bhx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.537955 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.537970 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb6cc74-4036-413a-b4e9-77ebfd72dfe1-serving-cert\") pod \"service-ca-operator-777779d784-qlml2\" (UID: \"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.537985 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/04b0fb8b-8694-4a16-affe-b4da74b2bd7b-signing-key\") pod \"service-ca-9c57cc56f-t6hrd\" (UID: \"04b0fb8b-8694-4a16-affe-b4da74b2bd7b\") " pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538004 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77n8j\" 
(UniqueName: \"kubernetes.io/projected/059abb1c-735b-49e6-9645-af1cb2e289b6-kube-api-access-77n8j\") pod \"multus-admission-controller-857f4d67dd-m5dvh\" (UID: \"059abb1c-735b-49e6-9645-af1cb2e289b6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538053 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-certificates\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538120 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r66vn\" (UniqueName: \"kubernetes.io/projected/98a131c9-fc6c-4a27-a774-227258b380c0-kube-api-access-r66vn\") pod \"marketplace-operator-79b997595-gvtnh\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538140 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/47bcdd61-f90c-433e-88d1-2677249c2a26-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jhppc\" (UID: \"47bcdd61-f90c-433e-88d1-2677249c2a26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538161 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6615ae9-24a7-462b-af60-38c579d9529e-proxy-tls\") pod \"machine-config-controller-84d6567774-rsz87\" (UID: 
\"b6615ae9-24a7-462b-af60-38c579d9529e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538190 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af8a4a0-d2ae-4438-b7f3-33999d922811-config\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538208 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbmrn\" (UniqueName: \"kubernetes.io/projected/7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5-kube-api-access-gbmrn\") pod \"olm-operator-6b444d44fb-7fkbx\" (UID: \"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538227 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pqrn\" (UniqueName: \"kubernetes.io/projected/48342a90-473d-4aef-a31c-dbc0b23fb352-kube-api-access-4pqrn\") pod \"migrator-59844c95c7-4b5h7\" (UID: \"48342a90-473d-4aef-a31c-dbc0b23fb352\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538256 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98560308-e35a-429d-8788-5526499119ec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhps2\" (UID: \"98560308-e35a-429d-8788-5526499119ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538275 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmrct\" (UniqueName: \"kubernetes.io/projected/47bcdd61-f90c-433e-88d1-2677249c2a26-kube-api-access-mmrct\") pod \"package-server-manager-789f6589d5-jhppc\" (UID: \"47bcdd61-f90c-433e-88d1-2677249c2a26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538312 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d008c5dd-f44f-4509-b705-46b4c8819684-config-volume\") pod \"collect-profiles-29410020-rqrgl\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538330 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zblgg\" (UniqueName: \"kubernetes.io/projected/98560308-e35a-429d-8788-5526499119ec-kube-api-access-zblgg\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhps2\" (UID: \"98560308-e35a-429d-8788-5526499119ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538474 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-trusted-ca\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538494 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3eb6cc74-4036-413a-b4e9-77ebfd72dfe1-config\") pod \"service-ca-operator-777779d784-qlml2\" (UID: \"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538516 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13bbcadd-9916-4e44-8167-d562215116aa-proxy-tls\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538533 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5-srv-cert\") pod \"olm-operator-6b444d44fb-7fkbx\" (UID: \"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538549 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dba22c56-9eb6-40b4-a6b6-559b3a847870-srv-cert\") pod \"catalog-operator-68c6474976-rmx6p\" (UID: \"dba22c56-9eb6-40b4-a6b6-559b3a847870\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:21 crc kubenswrapper[4931]: E1201 15:03:21.540870 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.040853436 +0000 UTC m=+148.466727103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.538572 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2af8a4a0-d2ae-4438-b7f3-33999d922811-etcd-client\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.544322 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm2n8\" (UniqueName: \"kubernetes.io/projected/2af8a4a0-d2ae-4438-b7f3-33999d922811-kube-api-access-pm2n8\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.544461 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2af8a4a0-d2ae-4438-b7f3-33999d922811-etcd-ca\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.547289 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjqt6\" (UniqueName: 
\"kubernetes.io/projected/0216ff96-a3b4-4486-91ab-f73485d18134-kube-api-access-sjqt6\") pod \"control-plane-machine-set-operator-78cbb6b69f-nx8gh\" (UID: \"0216ff96-a3b4-4486-91ab-f73485d18134\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.547374 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr899\" (UniqueName: \"kubernetes.io/projected/b6615ae9-24a7-462b-af60-38c579d9529e-kube-api-access-sr899\") pod \"machine-config-controller-84d6567774-rsz87\" (UID: \"b6615ae9-24a7-462b-af60-38c579d9529e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.547731 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/059abb1c-735b-49e6-9645-af1cb2e289b6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m5dvh\" (UID: \"059abb1c-735b-49e6-9645-af1cb2e289b6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.547806 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2229b8c-268a-46fd-bb3d-442032e330ff-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.547872 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/719cee4e-cf96-4192-b7ef-424be9e6759c-metrics-tls\") pod \"dns-operator-744455d44c-mkp6d\" (UID: \"719cee4e-cf96-4192-b7ef-424be9e6759c\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.547943 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/970e4401-a0dd-4b50-9ca7-45ae25a382b2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2rcl2\" (UID: \"970e4401-a0dd-4b50-9ca7-45ae25a382b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.548572 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/970e4401-a0dd-4b50-9ca7-45ae25a382b2-config\") pod \"kube-apiserver-operator-766d6c64bb-2rcl2\" (UID: \"970e4401-a0dd-4b50-9ca7-45ae25a382b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.548594 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2af8a4a0-d2ae-4438-b7f3-33999d922811-etcd-service-ca\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.548898 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d008c5dd-f44f-4509-b705-46b4c8819684-secret-volume\") pod \"collect-profiles-29410020-rqrgl\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.549027 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sng4g\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-kube-api-access-sng4g\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.549052 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-trusted-ca\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.549073 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b6615ae9-24a7-462b-af60-38c579d9529e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rsz87\" (UID: \"b6615ae9-24a7-462b-af60-38c579d9529e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.549121 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2229b8c-268a-46fd-bb3d-442032e330ff-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.549143 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3aebf976-96d6-4a5b-8650-bcc1fbf09566-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4pkk4\" (UID: \"3aebf976-96d6-4a5b-8650-bcc1fbf09566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.549197 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98560308-e35a-429d-8788-5526499119ec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhps2\" (UID: \"98560308-e35a-429d-8788-5526499119ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.550986 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3aebf976-96d6-4a5b-8650-bcc1fbf09566-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4pkk4\" (UID: \"3aebf976-96d6-4a5b-8650-bcc1fbf09566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.551011 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be21d285-dd3e-4577-b1fa-913c8ec20cb5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-x77jt\" (UID: \"be21d285-dd3e-4577-b1fa-913c8ec20cb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.551067 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/04b0fb8b-8694-4a16-affe-b4da74b2bd7b-signing-cabundle\") pod \"service-ca-9c57cc56f-t6hrd\" (UID: \"04b0fb8b-8694-4a16-affe-b4da74b2bd7b\") " pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.551104 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af8a4a0-d2ae-4438-b7f3-33999d922811-serving-cert\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.551262 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7fkbx\" (UID: \"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.551403 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dba22c56-9eb6-40b4-a6b6-559b3a847870-profile-collector-cert\") pod \"catalog-operator-68c6474976-rmx6p\" (UID: \"dba22c56-9eb6-40b4-a6b6-559b3a847870\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.553691 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-ckz9k"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.551461 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktkj\" (UniqueName: \"kubernetes.io/projected/719cee4e-cf96-4192-b7ef-424be9e6759c-kube-api-access-mktkj\") pod \"dns-operator-744455d44c-mkp6d\" (UID: \"719cee4e-cf96-4192-b7ef-424be9e6759c\") " pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.553780 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2xpg\" (UniqueName: \"kubernetes.io/projected/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-kube-api-access-f2xpg\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556579 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/970e4401-a0dd-4b50-9ca7-45ae25a382b2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2rcl2\" (UID: \"970e4401-a0dd-4b50-9ca7-45ae25a382b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556612 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-bound-sa-token\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556643 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpwrb\" (UniqueName: \"kubernetes.io/projected/04b0fb8b-8694-4a16-affe-b4da74b2bd7b-kube-api-access-zpwrb\") pod \"service-ca-9c57cc56f-t6hrd\" (UID: \"04b0fb8b-8694-4a16-affe-b4da74b2bd7b\") " pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556660 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be21d285-dd3e-4577-b1fa-913c8ec20cb5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-x77jt\" (UID: \"be21d285-dd3e-4577-b1fa-913c8ec20cb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556680 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h4pc\" (UniqueName: \"kubernetes.io/projected/dba22c56-9eb6-40b4-a6b6-559b3a847870-kube-api-access-9h4pc\") pod \"catalog-operator-68c6474976-rmx6p\" (UID: \"dba22c56-9eb6-40b4-a6b6-559b3a847870\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556700 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0216ff96-a3b4-4486-91ab-f73485d18134-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nx8gh\" (UID: \"0216ff96-a3b4-4486-91ab-f73485d18134\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556747 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9176363-3b09-4c07-bf06-5a82e81a86e5-cert\") pod \"ingress-canary-z4bhx\" (UID: \"c9176363-3b09-4c07-bf06-5a82e81a86e5\") " pod="openshift-ingress-canary/ingress-canary-z4bhx"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556807 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13bbcadd-9916-4e44-8167-d562215116aa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556846 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be21d285-dd3e-4577-b1fa-913c8ec20cb5-config\") pod \"kube-controller-manager-operator-78b949d7b-x77jt\" (UID: \"be21d285-dd3e-4577-b1fa-913c8ec20cb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556880 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gvtnh\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556962 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gvtnh\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.556982 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vng9c\" (UniqueName: \"kubernetes.io/projected/13bbcadd-9916-4e44-8167-d562215116aa-kube-api-access-vng9c\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.557056 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-tls\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.557117 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/13bbcadd-9916-4e44-8167-d562215116aa-images\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.561576 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2tbqk"]
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.583303 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.610423 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc"
Dec 01 15:03:21 crc kubenswrapper[4931]: W1201 15:03:21.633551 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d46b073_f023_4090_a6ec_4916356b1e4d.slice/crio-dd76d75c337901fa62ef00c708ae7cfa50f872a64c1fb5078596e0fcbd62c652 WatchSource:0}: Error finding container dd76d75c337901fa62ef00c708ae7cfa50f872a64c1fb5078596e0fcbd62c652: Status 404 returned error can't find the container with id dd76d75c337901fa62ef00c708ae7cfa50f872a64c1fb5078596e0fcbd62c652
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.658497 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 01 15:03:21 crc kubenswrapper[4931]: E1201 15:03:21.658674 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.158637192 +0000 UTC m=+148.584510859 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659114 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-certificates\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659144 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r66vn\" (UniqueName: \"kubernetes.io/projected/98a131c9-fc6c-4a27-a774-227258b380c0-kube-api-access-r66vn\") pod \"marketplace-operator-79b997595-gvtnh\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659163 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/47bcdd61-f90c-433e-88d1-2677249c2a26-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jhppc\" (UID: \"47bcdd61-f90c-433e-88d1-2677249c2a26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659187 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6615ae9-24a7-462b-af60-38c579d9529e-proxy-tls\") pod \"machine-config-controller-84d6567774-rsz87\" (UID: \"b6615ae9-24a7-462b-af60-38c579d9529e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659207 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af8a4a0-d2ae-4438-b7f3-33999d922811-config\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659222 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbmrn\" (UniqueName: \"kubernetes.io/projected/7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5-kube-api-access-gbmrn\") pod \"olm-operator-6b444d44fb-7fkbx\" (UID: \"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659236 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pqrn\" (UniqueName: \"kubernetes.io/projected/48342a90-473d-4aef-a31c-dbc0b23fb352-kube-api-access-4pqrn\") pod \"migrator-59844c95c7-4b5h7\" (UID: \"48342a90-473d-4aef-a31c-dbc0b23fb352\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659259 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-mountpoint-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659277 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98560308-e35a-429d-8788-5526499119ec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhps2\" (UID: \"98560308-e35a-429d-8788-5526499119ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659298 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4330e89b-63dc-4fa5-abee-3e383b77e182-config-volume\") pod \"dns-default-s2ws6\" (UID: \"4330e89b-63dc-4fa5-abee-3e383b77e182\") " pod="openshift-dns/dns-default-s2ws6"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659320 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d008c5dd-f44f-4509-b705-46b4c8819684-config-volume\") pod \"collect-profiles-29410020-rqrgl\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659337 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmrct\" (UniqueName: \"kubernetes.io/projected/47bcdd61-f90c-433e-88d1-2677249c2a26-kube-api-access-mmrct\") pod \"package-server-manager-789f6589d5-jhppc\" (UID: \"47bcdd61-f90c-433e-88d1-2677249c2a26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659354 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zblgg\" (UniqueName: \"kubernetes.io/projected/98560308-e35a-429d-8788-5526499119ec-kube-api-access-zblgg\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhps2\" (UID: \"98560308-e35a-429d-8788-5526499119ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659373 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-trusted-ca\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659424 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb6cc74-4036-413a-b4e9-77ebfd72dfe1-config\") pod \"service-ca-operator-777779d784-qlml2\" (UID: \"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659443 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-plugins-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659462 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13bbcadd-9916-4e44-8167-d562215116aa-proxy-tls\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659478 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5-srv-cert\") pod \"olm-operator-6b444d44fb-7fkbx\" (UID: \"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659494 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dba22c56-9eb6-40b4-a6b6-559b3a847870-srv-cert\") pod \"catalog-operator-68c6474976-rmx6p\" (UID: \"dba22c56-9eb6-40b4-a6b6-559b3a847870\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659514 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2af8a4a0-d2ae-4438-b7f3-33999d922811-etcd-client\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659533 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2n8\" (UniqueName: \"kubernetes.io/projected/2af8a4a0-d2ae-4438-b7f3-33999d922811-kube-api-access-pm2n8\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659555 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2af8a4a0-d2ae-4438-b7f3-33999d922811-etcd-ca\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659572 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjqt6\" (UniqueName: \"kubernetes.io/projected/0216ff96-a3b4-4486-91ab-f73485d18134-kube-api-access-sjqt6\") pod \"control-plane-machine-set-operator-78cbb6b69f-nx8gh\" (UID: \"0216ff96-a3b4-4486-91ab-f73485d18134\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659589 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr899\" (UniqueName: \"kubernetes.io/projected/b6615ae9-24a7-462b-af60-38c579d9529e-kube-api-access-sr899\") pod \"machine-config-controller-84d6567774-rsz87\" (UID: \"b6615ae9-24a7-462b-af60-38c579d9529e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659608 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/719cee4e-cf96-4192-b7ef-424be9e6759c-metrics-tls\") pod \"dns-operator-744455d44c-mkp6d\" (UID: \"719cee4e-cf96-4192-b7ef-424be9e6759c\") " pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659626 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/059abb1c-735b-49e6-9645-af1cb2e289b6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m5dvh\" (UID: \"059abb1c-735b-49e6-9645-af1cb2e289b6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659644 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2229b8c-268a-46fd-bb3d-442032e330ff-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659670 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/970e4401-a0dd-4b50-9ca7-45ae25a382b2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2rcl2\" (UID: \"970e4401-a0dd-4b50-9ca7-45ae25a382b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659692 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/970e4401-a0dd-4b50-9ca7-45ae25a382b2-config\") pod \"kube-apiserver-operator-766d6c64bb-2rcl2\" (UID: \"970e4401-a0dd-4b50-9ca7-45ae25a382b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659712 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2af8a4a0-d2ae-4438-b7f3-33999d922811-etcd-service-ca\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659728 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d008c5dd-f44f-4509-b705-46b4c8819684-secret-volume\") pod \"collect-profiles-29410020-rqrgl\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659767 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sng4g\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-kube-api-access-sng4g\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659786 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-registration-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659825 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-trusted-ca\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659868 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b6615ae9-24a7-462b-af60-38c579d9529e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rsz87\" (UID: \"b6615ae9-24a7-462b-af60-38c579d9529e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.659893 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r86mh\" (UniqueName: \"kubernetes.io/projected/c94c2159-be08-4c20-809f-2fc900cda887-kube-api-access-r86mh\") pod \"machine-config-server-csjj4\" (UID: \"c94c2159-be08-4c20-809f-2fc900cda887\") " pod="openshift-machine-config-operator/machine-config-server-csjj4"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660079 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2229b8c-268a-46fd-bb3d-442032e330ff-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660153 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3aebf976-96d6-4a5b-8650-bcc1fbf09566-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4pkk4\" (UID: \"3aebf976-96d6-4a5b-8650-bcc1fbf09566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660204 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98560308-e35a-429d-8788-5526499119ec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhps2\" (UID: \"98560308-e35a-429d-8788-5526499119ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660228 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3aebf976-96d6-4a5b-8650-bcc1fbf09566-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4pkk4\" (UID: \"3aebf976-96d6-4a5b-8650-bcc1fbf09566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660265 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c94c2159-be08-4c20-809f-2fc900cda887-node-bootstrap-token\") pod \"machine-config-server-csjj4\" (UID: \"c94c2159-be08-4c20-809f-2fc900cda887\") " pod="openshift-machine-config-operator/machine-config-server-csjj4"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660288 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be21d285-dd3e-4577-b1fa-913c8ec20cb5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-x77jt\" (UID: \"be21d285-dd3e-4577-b1fa-913c8ec20cb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660309 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7fkbx\" (UID: \"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660327 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/04b0fb8b-8694-4a16-affe-b4da74b2bd7b-signing-cabundle\") pod \"service-ca-9c57cc56f-t6hrd\" (UID: \"04b0fb8b-8694-4a16-affe-b4da74b2bd7b\") " pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd"
Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660345 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n897j\" (UniqueName: \"kubernetes.io/projected/433fc98e-4157-4260-996c-ce59a2b6dc52-kube-api-access-n897j\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5"
Dec 01 15:03:21 crc kubenswrapper[4931]:
I1201 15:03:21.660362 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4330e89b-63dc-4fa5-abee-3e383b77e182-metrics-tls\") pod \"dns-default-s2ws6\" (UID: \"4330e89b-63dc-4fa5-abee-3e383b77e182\") " pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660475 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af8a4a0-d2ae-4438-b7f3-33999d922811-serving-cert\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660499 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-csi-data-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660545 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2xpg\" (UniqueName: \"kubernetes.io/projected/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-kube-api-access-f2xpg\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660565 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dba22c56-9eb6-40b4-a6b6-559b3a847870-profile-collector-cert\") pod \"catalog-operator-68c6474976-rmx6p\" (UID: \"dba22c56-9eb6-40b4-a6b6-559b3a847870\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660629 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktkj\" (UniqueName: \"kubernetes.io/projected/719cee4e-cf96-4192-b7ef-424be9e6759c-kube-api-access-mktkj\") pod \"dns-operator-744455d44c-mkp6d\" (UID: \"719cee4e-cf96-4192-b7ef-424be9e6759c\") " pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660649 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t8cb\" (UniqueName: \"kubernetes.io/projected/4330e89b-63dc-4fa5-abee-3e383b77e182-kube-api-access-9t8cb\") pod \"dns-default-s2ws6\" (UID: \"4330e89b-63dc-4fa5-abee-3e383b77e182\") " pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660680 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/970e4401-a0dd-4b50-9ca7-45ae25a382b2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2rcl2\" (UID: \"970e4401-a0dd-4b50-9ca7-45ae25a382b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660700 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be21d285-dd3e-4577-b1fa-913c8ec20cb5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-x77jt\" (UID: \"be21d285-dd3e-4577-b1fa-913c8ec20cb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660699 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-certificates\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660718 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-bound-sa-token\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660793 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpwrb\" (UniqueName: \"kubernetes.io/projected/04b0fb8b-8694-4a16-affe-b4da74b2bd7b-kube-api-access-zpwrb\") pod \"service-ca-9c57cc56f-t6hrd\" (UID: \"04b0fb8b-8694-4a16-affe-b4da74b2bd7b\") " pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660824 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4pc\" (UniqueName: \"kubernetes.io/projected/dba22c56-9eb6-40b4-a6b6-559b3a847870-kube-api-access-9h4pc\") pod \"catalog-operator-68c6474976-rmx6p\" (UID: \"dba22c56-9eb6-40b4-a6b6-559b3a847870\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660845 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0216ff96-a3b4-4486-91ab-f73485d18134-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nx8gh\" (UID: \"0216ff96-a3b4-4486-91ab-f73485d18134\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh" Dec 01 
15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660870 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9176363-3b09-4c07-bf06-5a82e81a86e5-cert\") pod \"ingress-canary-z4bhx\" (UID: \"c9176363-3b09-4c07-bf06-5a82e81a86e5\") " pod="openshift-ingress-canary/ingress-canary-z4bhx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660897 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/13bbcadd-9916-4e44-8167-d562215116aa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660917 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be21d285-dd3e-4577-b1fa-913c8ec20cb5-config\") pod \"kube-controller-manager-operator-78b949d7b-x77jt\" (UID: \"be21d285-dd3e-4577-b1fa-913c8ec20cb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660936 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gvtnh\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660967 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-gvtnh\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.660983 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-tls\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661002 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vng9c\" (UniqueName: \"kubernetes.io/projected/13bbcadd-9916-4e44-8167-d562215116aa-kube-api-access-vng9c\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661029 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-socket-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661063 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/13bbcadd-9916-4e44-8167-d562215116aa-images\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661085 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"certs\" (UniqueName: \"kubernetes.io/secret/c94c2159-be08-4c20-809f-2fc900cda887-certs\") pod \"machine-config-server-csjj4\" (UID: \"c94c2159-be08-4c20-809f-2fc900cda887\") " pod="openshift-machine-config-operator/machine-config-server-csjj4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661110 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-metrics-tls\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661138 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aebf976-96d6-4a5b-8650-bcc1fbf09566-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4pkk4\" (UID: \"3aebf976-96d6-4a5b-8650-bcc1fbf09566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661155 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwsrb\" (UniqueName: \"kubernetes.io/projected/3eb6cc74-4036-413a-b4e9-77ebfd72dfe1-kube-api-access-vwsrb\") pod \"service-ca-operator-777779d784-qlml2\" (UID: \"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661175 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skvvg\" (UniqueName: \"kubernetes.io/projected/d008c5dd-f44f-4509-b705-46b4c8819684-kube-api-access-skvvg\") pod \"collect-profiles-29410020-rqrgl\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:21 crc 
kubenswrapper[4931]: I1201 15:03:21.661210 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661233 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb6cc74-4036-413a-b4e9-77ebfd72dfe1-serving-cert\") pod \"service-ca-operator-777779d784-qlml2\" (UID: \"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661248 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xdms\" (UniqueName: \"kubernetes.io/projected/c9176363-3b09-4c07-bf06-5a82e81a86e5-kube-api-access-7xdms\") pod \"ingress-canary-z4bhx\" (UID: \"c9176363-3b09-4c07-bf06-5a82e81a86e5\") " pod="openshift-ingress-canary/ingress-canary-z4bhx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661268 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661288 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/04b0fb8b-8694-4a16-affe-b4da74b2bd7b-signing-key\") pod \"service-ca-9c57cc56f-t6hrd\" (UID: \"04b0fb8b-8694-4a16-affe-b4da74b2bd7b\") 
" pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661309 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77n8j\" (UniqueName: \"kubernetes.io/projected/059abb1c-735b-49e6-9645-af1cb2e289b6-kube-api-access-77n8j\") pod \"multus-admission-controller-857f4d67dd-m5dvh\" (UID: \"059abb1c-735b-49e6-9645-af1cb2e289b6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.661559 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98560308-e35a-429d-8788-5526499119ec-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhps2\" (UID: \"98560308-e35a-429d-8788-5526499119ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.662191 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d008c5dd-f44f-4509-b705-46b4c8819684-config-volume\") pod \"collect-profiles-29410020-rqrgl\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.663432 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3eb6cc74-4036-413a-b4e9-77ebfd72dfe1-config\") pod \"service-ca-operator-777779d784-qlml2\" (UID: \"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.671951 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/13bbcadd-9916-4e44-8167-d562215116aa-proxy-tls\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.672270 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2229b8c-268a-46fd-bb3d-442032e330ff-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: E1201 15:03:21.672466 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.172447248 +0000 UTC m=+148.598320915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.673001 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5-srv-cert\") pod \"olm-operator-6b444d44fb-7fkbx\" (UID: \"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.673496 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/13bbcadd-9916-4e44-8167-d562215116aa-images\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.673897 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/970e4401-a0dd-4b50-9ca7-45ae25a382b2-config\") pod \"kube-apiserver-operator-766d6c64bb-2rcl2\" (UID: \"970e4401-a0dd-4b50-9ca7-45ae25a382b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.675443 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be21d285-dd3e-4577-b1fa-913c8ec20cb5-config\") pod \"kube-controller-manager-operator-78b949d7b-x77jt\" (UID: 
\"be21d285-dd3e-4577-b1fa-913c8ec20cb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.676644 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aebf976-96d6-4a5b-8650-bcc1fbf09566-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4pkk4\" (UID: \"3aebf976-96d6-4a5b-8650-bcc1fbf09566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.677508 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dba22c56-9eb6-40b4-a6b6-559b3a847870-srv-cert\") pod \"catalog-operator-68c6474976-rmx6p\" (UID: \"dba22c56-9eb6-40b4-a6b6-559b3a847870\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.678465 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b6615ae9-24a7-462b-af60-38c579d9529e-proxy-tls\") pod \"machine-config-controller-84d6567774-rsz87\" (UID: \"b6615ae9-24a7-462b-af60-38c579d9529e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.678816 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2af8a4a0-d2ae-4438-b7f3-33999d922811-config\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.679472 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/13bbcadd-9916-4e44-8167-d562215116aa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.680436 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/47bcdd61-f90c-433e-88d1-2677249c2a26-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jhppc\" (UID: \"47bcdd61-f90c-433e-88d1-2677249c2a26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.680533 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/970e4401-a0dd-4b50-9ca7-45ae25a382b2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2rcl2\" (UID: \"970e4401-a0dd-4b50-9ca7-45ae25a382b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.682140 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b6615ae9-24a7-462b-af60-38c579d9529e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rsz87\" (UID: \"b6615ae9-24a7-462b-af60-38c579d9529e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.683238 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/04b0fb8b-8694-4a16-affe-b4da74b2bd7b-signing-cabundle\") pod \"service-ca-9c57cc56f-t6hrd\" (UID: \"04b0fb8b-8694-4a16-affe-b4da74b2bd7b\") " pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" Dec 
01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.684881 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2af8a4a0-d2ae-4438-b7f3-33999d922811-etcd-ca\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.685181 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-trusted-ca\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.686225 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2af8a4a0-d2ae-4438-b7f3-33999d922811-serving-cert\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.691741 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3eb6cc74-4036-413a-b4e9-77ebfd72dfe1-serving-cert\") pod \"service-ca-operator-777779d784-qlml2\" (UID: \"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.692082 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.696680 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gvtnh\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.697679 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-trusted-ca\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.698074 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2af8a4a0-d2ae-4438-b7f3-33999d922811-etcd-service-ca\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.712146 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/04b0fb8b-8694-4a16-affe-b4da74b2bd7b-signing-key\") pod \"service-ca-9c57cc56f-t6hrd\" (UID: \"04b0fb8b-8694-4a16-affe-b4da74b2bd7b\") " pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.714469 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-tls\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: 
\"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.714921 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-bound-sa-token\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.718665 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0216ff96-a3b4-4486-91ab-f73485d18134-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-nx8gh\" (UID: \"0216ff96-a3b4-4486-91ab-f73485d18134\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.719439 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d008c5dd-f44f-4509-b705-46b4c8819684-secret-volume\") pod \"collect-profiles-29410020-rqrgl\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.725803 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be21d285-dd3e-4577-b1fa-913c8ec20cb5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-x77jt\" (UID: \"be21d285-dd3e-4577-b1fa-913c8ec20cb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.726921 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/2af8a4a0-d2ae-4438-b7f3-33999d922811-etcd-client\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.727140 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-metrics-tls\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.727267 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98560308-e35a-429d-8788-5526499119ec-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nhps2\" (UID: \"98560308-e35a-429d-8788-5526499119ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.727504 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3aebf976-96d6-4a5b-8650-bcc1fbf09566-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4pkk4\" (UID: \"3aebf976-96d6-4a5b-8650-bcc1fbf09566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.727722 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2229b8c-268a-46fd-bb3d-442032e330ff-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc 
kubenswrapper[4931]: I1201 15:03:21.728678 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dba22c56-9eb6-40b4-a6b6-559b3a847870-profile-collector-cert\") pod \"catalog-operator-68c6474976-rmx6p\" (UID: \"dba22c56-9eb6-40b4-a6b6-559b3a847870\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.730556 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9176363-3b09-4c07-bf06-5a82e81a86e5-cert\") pod \"ingress-canary-z4bhx\" (UID: \"c9176363-3b09-4c07-bf06-5a82e81a86e5\") " pod="openshift-ingress-canary/ingress-canary-z4bhx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.731869 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gvtnh\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.736621 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/059abb1c-735b-49e6-9645-af1cb2e289b6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m5dvh\" (UID: \"059abb1c-735b-49e6-9645-af1cb2e289b6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.737686 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/719cee4e-cf96-4192-b7ef-424be9e6759c-metrics-tls\") pod \"dns-operator-744455d44c-mkp6d\" (UID: \"719cee4e-cf96-4192-b7ef-424be9e6759c\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.739898 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77n8j\" (UniqueName: \"kubernetes.io/projected/059abb1c-735b-49e6-9645-af1cb2e289b6-kube-api-access-77n8j\") pod \"multus-admission-controller-857f4d67dd-m5dvh\" (UID: \"059abb1c-735b-49e6-9645-af1cb2e289b6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.740878 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7fkbx\" (UID: \"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763011 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r66vn\" (UniqueName: \"kubernetes.io/projected/98a131c9-fc6c-4a27-a774-227258b380c0-kube-api-access-r66vn\") pod \"marketplace-operator-79b997595-gvtnh\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763144 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:21 crc kubenswrapper[4931]: E1201 15:03:21.763305 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.263283012 +0000 UTC m=+148.689156679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763464 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763614 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-mountpoint-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763686 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4330e89b-63dc-4fa5-abee-3e383b77e182-config-volume\") pod \"dns-default-s2ws6\" (UID: \"4330e89b-63dc-4fa5-abee-3e383b77e182\") " pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763762 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-plugins-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmrct\" (UniqueName: \"kubernetes.io/projected/47bcdd61-f90c-433e-88d1-2677249c2a26-kube-api-access-mmrct\") pod \"package-server-manager-789f6589d5-jhppc\" (UID: \"47bcdd61-f90c-433e-88d1-2677249c2a26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763903 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-registration-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763969 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-mountpoint-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.763973 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r86mh\" (UniqueName: \"kubernetes.io/projected/c94c2159-be08-4c20-809f-2fc900cda887-kube-api-access-r86mh\") pod \"machine-config-server-csjj4\" (UID: \"c94c2159-be08-4c20-809f-2fc900cda887\") " pod="openshift-machine-config-operator/machine-config-server-csjj4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.764064 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c94c2159-be08-4c20-809f-2fc900cda887-node-bootstrap-token\") pod \"machine-config-server-csjj4\" (UID: \"c94c2159-be08-4c20-809f-2fc900cda887\") " pod="openshift-machine-config-operator/machine-config-server-csjj4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.764117 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n897j\" (UniqueName: \"kubernetes.io/projected/433fc98e-4157-4260-996c-ce59a2b6dc52-kube-api-access-n897j\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.764138 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4330e89b-63dc-4fa5-abee-3e383b77e182-metrics-tls\") pod \"dns-default-s2ws6\" (UID: \"4330e89b-63dc-4fa5-abee-3e383b77e182\") " pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.764196 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-csi-data-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.764231 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t8cb\" (UniqueName: \"kubernetes.io/projected/4330e89b-63dc-4fa5-abee-3e383b77e182-kube-api-access-9t8cb\") pod \"dns-default-s2ws6\" (UID: \"4330e89b-63dc-4fa5-abee-3e383b77e182\") " pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.764352 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-socket-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.764396 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c94c2159-be08-4c20-809f-2fc900cda887-certs\") pod \"machine-config-server-csjj4\" (UID: \"c94c2159-be08-4c20-809f-2fc900cda887\") " pod="openshift-machine-config-operator/machine-config-server-csjj4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.765196 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-plugins-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.765278 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-csi-data-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.765327 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-socket-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.765633 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4330e89b-63dc-4fa5-abee-3e383b77e182-config-volume\") pod 
\"dns-default-s2ws6\" (UID: \"4330e89b-63dc-4fa5-abee-3e383b77e182\") " pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.765682 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/433fc98e-4157-4260-996c-ce59a2b6dc52-registration-dir\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:21 crc kubenswrapper[4931]: E1201 15:03:21.766003 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.265994979 +0000 UTC m=+148.691868646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.770782 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c94c2159-be08-4c20-809f-2fc900cda887-node-bootstrap-token\") pod \"machine-config-server-csjj4\" (UID: \"c94c2159-be08-4c20-809f-2fc900cda887\") " pod="openshift-machine-config-operator/machine-config-server-csjj4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.776451 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zblgg\" (UniqueName: \"kubernetes.io/projected/98560308-e35a-429d-8788-5526499119ec-kube-api-access-zblgg\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-nhps2\" (UID: \"98560308-e35a-429d-8788-5526499119ec\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.782157 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c94c2159-be08-4c20-809f-2fc900cda887-certs\") pod \"machine-config-server-csjj4\" (UID: \"c94c2159-be08-4c20-809f-2fc900cda887\") " pod="openshift-machine-config-operator/machine-config-server-csjj4" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.785797 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4330e89b-63dc-4fa5-abee-3e383b77e182-metrics-tls\") pod \"dns-default-s2ws6\" (UID: \"4330e89b-63dc-4fa5-abee-3e383b77e182\") " pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.801153 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vng9c\" (UniqueName: \"kubernetes.io/projected/13bbcadd-9916-4e44-8167-d562215116aa-kube-api-access-vng9c\") pod \"machine-config-operator-74547568cd-rnsbw\" (UID: \"13bbcadd-9916-4e44-8167-d562215116aa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.804432 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.832561 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skvvg\" (UniqueName: \"kubernetes.io/projected/d008c5dd-f44f-4509-b705-46b4c8819684-kube-api-access-skvvg\") pod \"collect-profiles-29410020-rqrgl\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.848066 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sng4g\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-kube-api-access-sng4g\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.848234 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.865134 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:21 crc kubenswrapper[4931]: E1201 15:03:21.865606 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.365585744 +0000 UTC m=+148.791459411 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.871283 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xdms\" (UniqueName: \"kubernetes.io/projected/c9176363-3b09-4c07-bf06-5a82e81a86e5-kube-api-access-7xdms\") pod \"ingress-canary-z4bhx\" (UID: \"c9176363-3b09-4c07-bf06-5a82e81a86e5\") " pod="openshift-ingress-canary/ingress-canary-z4bhx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.878958 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mbql2"] Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.888319 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pqrn\" (UniqueName: \"kubernetes.io/projected/48342a90-473d-4aef-a31c-dbc0b23fb352-kube-api-access-4pqrn\") pod \"migrator-59844c95c7-4b5h7\" (UID: \"48342a90-473d-4aef-a31c-dbc0b23fb352\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.890362 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.899315 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.916614 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbmrn\" (UniqueName: \"kubernetes.io/projected/7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5-kube-api-access-gbmrn\") pod \"olm-operator-6b444d44fb-7fkbx\" (UID: \"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.917006 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.932706 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.945921 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpwrb\" (UniqueName: \"kubernetes.io/projected/04b0fb8b-8694-4a16-affe-b4da74b2bd7b-kube-api-access-zpwrb\") pod \"service-ca-9c57cc56f-t6hrd\" (UID: \"04b0fb8b-8694-4a16-affe-b4da74b2bd7b\") " pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.971755 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4pc\" (UniqueName: \"kubernetes.io/projected/dba22c56-9eb6-40b4-a6b6-559b3a847870-kube-api-access-9h4pc\") pod \"catalog-operator-68c6474976-rmx6p\" (UID: \"dba22c56-9eb6-40b4-a6b6-559b3a847870\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:21 
crc kubenswrapper[4931]: I1201 15:03:21.975497 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:21 crc kubenswrapper[4931]: E1201 15:03:21.975906 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.475887395 +0000 UTC m=+148.901761062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:21 crc kubenswrapper[4931]: I1201 15:03:21.981031 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-drngn"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.000651 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3aebf976-96d6-4a5b-8650-bcc1fbf09566-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4pkk4\" (UID: \"3aebf976-96d6-4a5b-8650-bcc1fbf09566\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.010064 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be21d285-dd3e-4577-b1fa-913c8ec20cb5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-x77jt\" (UID: \"be21d285-dd3e-4577-b1fa-913c8ec20cb5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.014521 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zz6cp"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.015884 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.040625 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjqt6\" (UniqueName: \"kubernetes.io/projected/0216ff96-a3b4-4486-91ab-f73485d18134-kube-api-access-sjqt6\") pod \"control-plane-machine-set-operator-78cbb6b69f-nx8gh\" (UID: \"0216ff96-a3b4-4486-91ab-f73485d18134\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.052930 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2n8\" (UniqueName: \"kubernetes.io/projected/2af8a4a0-d2ae-4438-b7f3-33999d922811-kube-api-access-pm2n8\") pod \"etcd-operator-b45778765-dgzzm\" (UID: \"2af8a4a0-d2ae-4438-b7f3-33999d922811\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.062953 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-drngn" event={"ID":"c907e960-f833-4546-89df-491334c4fe72","Type":"ContainerStarted","Data":"5ce858f2ca74a69d3d395cc8649134ec0be1dfab849d5a2c2c3057360cc2e295"} Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.063369 4931 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" event={"ID":"aabe6734-63b5-412a-80f3-c07b3a9b3071","Type":"ContainerStarted","Data":"c5aa4eca8fcc4f9dcda5122ac92d9ea5ceb9871cc870e1ee489ae9fd4aae3984"} Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.065621 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" event={"ID":"bc591dda-88eb-4eff-be2b-dfc142b4aa50","Type":"ContainerStarted","Data":"40af8c5e1355afa8a035a7b7e0fc4ddb99e92b4def34056befdf19829adef5ca"} Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.065678 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" event={"ID":"bc591dda-88eb-4eff-be2b-dfc142b4aa50","Type":"ContainerStarted","Data":"a502aba0ffe1fabfe1bfeb97607fbad5adae23166111e47099abc536d276b8ca"} Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.065916 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.067517 4931 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9tvd2 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.067553 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" podUID="bc591dda-88eb-4eff-be2b-dfc142b4aa50" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.069913 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-vwsrb\" (UniqueName: \"kubernetes.io/projected/3eb6cc74-4036-413a-b4e9-77ebfd72dfe1-kube-api-access-vwsrb\") pod \"service-ca-operator-777779d784-qlml2\" (UID: \"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.076833 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.077421 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.078136 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" event={"ID":"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7","Type":"ContainerStarted","Data":"9716ce01483b67439a9f1506c2dce80e266c22ac03fc64b14164afbb3c639290"} Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.078750 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z4bhx" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.080032 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.080419 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 15:03:22.580401481 +0000 UTC m=+149.006275148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.083729 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.090016 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.112964 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7" Dec 01 15:03:22 crc kubenswrapper[4931]: W1201 15:03:22.113683 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf75a1fc_4c7b_4f3b_bc50_9d0fd4d7f52f.slice/crio-8f2ea4e0c2e9677ab4b7a0c8d951f398cf9cfe841aa207ebff063d65b735e988 WatchSource:0}: Error finding container 8f2ea4e0c2e9677ab4b7a0c8d951f398cf9cfe841aa207ebff063d65b735e988: Status 404 returned error can't find the container with id 8f2ea4e0c2e9677ab4b7a0c8d951f398cf9cfe841aa207ebff063d65b735e988 Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.113904 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ckz9k" event={"ID":"6d46b073-f023-4090-a6ec-4916356b1e4d","Type":"ContainerStarted","Data":"22003c8bfcc6e0233c8e19043c2cbf6c44cd699563609e5f181906604fcbf76b"} Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.113944 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-ckz9k" event={"ID":"6d46b073-f023-4090-a6ec-4916356b1e4d","Type":"ContainerStarted","Data":"dd76d75c337901fa62ef00c708ae7cfa50f872a64c1fb5078596e0fcbd62c652"} Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.114295 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktkj\" (UniqueName: \"kubernetes.io/projected/719cee4e-cf96-4192-b7ef-424be9e6759c-kube-api-access-mktkj\") pod \"dns-operator-744455d44c-mkp6d\" (UID: \"719cee4e-cf96-4192-b7ef-424be9e6759c\") " pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.119457 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.132101 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/970e4401-a0dd-4b50-9ca7-45ae25a382b2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2rcl2\" (UID: \"970e4401-a0dd-4b50-9ca7-45ae25a382b2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.133956 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.134991 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.138975 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.145097 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2xpg\" (UniqueName: \"kubernetes.io/projected/c3c33d78-1d0c-4f91-a93c-e27fe57bbce1-kube-api-access-f2xpg\") pod \"ingress-operator-5b745b69d9-bwlrf\" (UID: \"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.157802 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr899\" (UniqueName: \"kubernetes.io/projected/b6615ae9-24a7-462b-af60-38c579d9529e-kube-api-access-sr899\") pod \"machine-config-controller-84d6567774-rsz87\" (UID: \"b6615ae9-24a7-462b-af60-38c579d9529e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" Dec 01 15:03:22 crc kubenswrapper[4931]: W1201 15:03:22.166567 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eef0925_d18a_4d34_9301_9cf5f900a39e.slice/crio-def05779449a22c92a425d263a6f7b96725c25733911c8dc28885e1ab324668b WatchSource:0}: Error finding container def05779449a22c92a425d263a6f7b96725c25733911c8dc28885e1ab324668b: Status 404 returned error can't find the container with id def05779449a22c92a425d263a6f7b96725c25733911c8dc28885e1ab324668b Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.168286 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.173160 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" event={"ID":"7b5e6777-79ed-4101-9259-38715fc413c6","Type":"ContainerStarted","Data":"a5c0821b3ebdccbca387888c57f09268cdee140f25a5c88114b410800c58c42b"} Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.173212 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" event={"ID":"7b5e6777-79ed-4101-9259-38715fc413c6","Type":"ContainerStarted","Data":"b264f8615e5404aa32214330718b4057c85a50354a4d6db06fac9bf612114c76"} Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.176893 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.183741 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.189292 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.689270252 +0000 UTC m=+149.115143919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.194646 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zkb6c"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.195059 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.196027 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.199100 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.214515 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.220736 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-knjbc"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.223343 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n897j\" (UniqueName: \"kubernetes.io/projected/433fc98e-4157-4260-996c-ce59a2b6dc52-kube-api-access-n897j\") pod \"csi-hostpathplugin-fwfk5\" (UID: \"433fc98e-4157-4260-996c-ce59a2b6dc52\") " pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.227495 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r86mh\" (UniqueName: \"kubernetes.io/projected/c94c2159-be08-4c20-809f-2fc900cda887-kube-api-access-r86mh\") pod \"machine-config-server-csjj4\" (UID: \"c94c2159-be08-4c20-809f-2fc900cda887\") " pod="openshift-machine-config-operator/machine-config-server-csjj4" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.230609 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.232174 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t8cb\" (UniqueName: \"kubernetes.io/projected/4330e89b-63dc-4fa5-abee-3e383b77e182-kube-api-access-9t8cb\") pod \"dns-default-s2ws6\" (UID: \"4330e89b-63dc-4fa5-abee-3e383b77e182\") " pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.232733 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2v4w"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.248715 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.286770 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.286975 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.786949092 +0000 UTC m=+149.212822759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.287071 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.287406 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.787399365 +0000 UTC m=+149.213273022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.387856 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.388318 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.888298987 +0000 UTC m=+149.314172644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.390742 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lxp5s"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.404046 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.417082 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.423296 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ng4qc"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.426136 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2" Dec 01 15:03:22 crc kubenswrapper[4931]: W1201 15:03:22.483089 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77e59ae9_2ed1_4c42_a17b_95c677bac560.slice/crio-d09eefd8cdd67cbe77697cbdd806cb5d0639e9296545410f98358b1227fb8eb9 WatchSource:0}: Error finding container d09eefd8cdd67cbe77697cbdd806cb5d0639e9296545410f98358b1227fb8eb9: Status 404 returned error can't find the container with id d09eefd8cdd67cbe77697cbdd806cb5d0639e9296545410f98358b1227fb8eb9 Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.489314 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.489713 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:22.989697363 +0000 UTC m=+149.415571030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.524340 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-csjj4" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.537982 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.554939 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.593052 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.594946 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.595865 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 15:03:23.095818155 +0000 UTC m=+149.521691812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.597745 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.598838 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gvtnh"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.599792 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m5dvh"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.667741 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:22 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:22 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:22 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.667787 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:22 crc 
kubenswrapper[4931]: I1201 15:03:22.697272 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.697824 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:23.197809179 +0000 UTC m=+149.623682846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.804550 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.804764 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 15:03:23.304727314 +0000 UTC m=+149.730600981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.808063 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.808686 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:23.308661026 +0000 UTC m=+149.734534693 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.823290 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qlml2"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.834000 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.849062 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4"] Dec 01 15:03:22 crc kubenswrapper[4931]: I1201 15:03:22.910494 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:22 crc kubenswrapper[4931]: E1201 15:03:22.911431 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:23.411401602 +0000 UTC m=+149.837275289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.013350 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.013772 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:23.513756816 +0000 UTC m=+149.939630483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.115275 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.115435 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:23.615410259 +0000 UTC m=+150.041283926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.115823 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.116184 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:23.616171531 +0000 UTC m=+150.042045198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.221931 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zz6cp" event={"ID":"af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f","Type":"ContainerStarted","Data":"16016cf0479bd5aae98fe6c77461f5970fba8745ddf8bb360b37332d25f510f8"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.222033 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zz6cp" event={"ID":"af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f","Type":"ContainerStarted","Data":"8f2ea4e0c2e9677ab4b7a0c8d951f398cf9cfe841aa207ebff063d65b735e988"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.223347 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.223659 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zz6cp" Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.223913 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 15:03:23.723894349 +0000 UTC m=+150.149768016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.225349 4931 patch_prober.go:28] interesting pod/downloads-7954f5f757-zz6cp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.225403 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zz6cp" podUID="af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.227258 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" event={"ID":"7b5e6777-79ed-4101-9259-38715fc413c6","Type":"ContainerStarted","Data":"2e43c5522b87373971e9d4ecdff312705d021710403855e804b106baf765bf14"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.231180 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" event={"ID":"98a131c9-fc6c-4a27-a774-227258b380c0","Type":"ContainerStarted","Data":"a7f00b2e554dbc7a65f2e830b2be5044522a2e280da96b0e230b97acdc6afcb0"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.233809 
4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.236484 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" event={"ID":"77e59ae9-2ed1-4c42-a17b-95c677bac560","Type":"ContainerStarted","Data":"d09eefd8cdd67cbe77697cbdd806cb5d0639e9296545410f98358b1227fb8eb9"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.242226 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" event={"ID":"f62b6206-f14c-4f4d-a1c1-af09036abdcf","Type":"ContainerStarted","Data":"f335ad7d4724ba1ab371106db7ad885bca32547859a9658567b55f0283e43d68"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.242281 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" event={"ID":"f62b6206-f14c-4f4d-a1c1-af09036abdcf","Type":"ContainerStarted","Data":"5050ae4276c73c0a13c9cf3c723f12ce155faa8f83a40ca4500a60860fa6f257"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.248913 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-drngn" event={"ID":"c907e960-f833-4546-89df-491334c4fe72","Type":"ContainerStarted","Data":"71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.265644 4931 generic.go:334] "Generic (PLEG): container finished" podID="aabe6734-63b5-412a-80f3-c07b3a9b3071" containerID="de29857abb1412e6b3b4c09f15d9ee47eca6b5a9b5b78d16f062428d756b20b4" exitCode=0 Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.266830 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" 
event={"ID":"aabe6734-63b5-412a-80f3-c07b3a9b3071","Type":"ContainerDied","Data":"de29857abb1412e6b3b4c09f15d9ee47eca6b5a9b5b78d16f062428d756b20b4"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.275822 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" event={"ID":"3aebf976-96d6-4a5b-8650-bcc1fbf09566","Type":"ContainerStarted","Data":"c13723a4eae67fe9f1a5b7ea5bb98b362fbed5c33ed0625b74939ada9d678e95"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.279179 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" event={"ID":"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7","Type":"ContainerStarted","Data":"bf0f379bfbab9a1f2c4922ec06288d8f85cdc5b8ff8a75d1d6a21a175a8501c3"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.279270 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" event={"ID":"0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7","Type":"ContainerStarted","Data":"a0ca4887f4fe0e3ecd686524961febf41f7e9997cb769a51ff077620b7eb4b16"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.284588 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" event={"ID":"47bcdd61-f90c-433e-88d1-2677249c2a26","Type":"ContainerStarted","Data":"361f5a43d4ef73ed525fc1c314c1c24fe0dbac7e5e65a870c8d321732901ce5f"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.285788 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" event={"ID":"39618657-35da-4567-a8f8-7f53bd12b8be","Type":"ContainerStarted","Data":"3bd27c5c19b1c2c42d7c844d515abbe9303d6eb3c1f4e48bcdac9257e2ab8dc0"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.287936 4931 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" event={"ID":"059abb1c-735b-49e6-9645-af1cb2e289b6","Type":"ContainerStarted","Data":"607f8f9233716bd42282027572be64db44e90c9c540f5c2a37fc4ec56bca22c3"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.290264 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" event={"ID":"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1","Type":"ContainerStarted","Data":"4b7039f90b7f33e45686f1819089a0a2324d26e45c3b903a19dd95f684956c01"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.293526 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" event={"ID":"10126253-d0e8-4f30-9047-1780a718e251","Type":"ContainerStarted","Data":"af896787a11a18255bc4a1159c06753e7b1b57e19c8a52bb6afb58397f56c974"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.302732 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" event={"ID":"9eef0925-d18a-4d34-9301-9cf5f900a39e","Type":"ContainerStarted","Data":"def05779449a22c92a425d263a6f7b96725c25733911c8dc28885e1ab324668b"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.321840 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" event={"ID":"cd7f3d80-238e-4e01-8d1a-4ee23eb29230","Type":"ContainerStarted","Data":"15a7735040878e9b7a2108fd8801d3cf179d7ce9d7c1c1e87aa8cd62459e5a63"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.324912 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.329067 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:23.829051383 +0000 UTC m=+150.254925050 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.330693 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" event={"ID":"2a8ab13b-92c4-40b4-93fa-3bde54f728d6","Type":"ContainerStarted","Data":"716c59b0082227d165e50d1391adc24ecb1c88a71458b575f0443aa85a4e7200"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.330738 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" event={"ID":"2a8ab13b-92c4-40b4-93fa-3bde54f728d6","Type":"ContainerStarted","Data":"e5259459f4e4efee9f0a38dbf283363cd2aebf879a99f47b40d76c253c0a7ec1"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.339125 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" event={"ID":"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d","Type":"ContainerStarted","Data":"361fb6b6614581beff456fd25c5471b925ecba241a0427795239582bfadad4ec"} Dec 01 15:03:23 crc 
kubenswrapper[4931]: I1201 15:03:23.354364 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-csjj4" event={"ID":"c94c2159-be08-4c20-809f-2fc900cda887","Type":"ContainerStarted","Data":"2451de349330ece4ed1c737eb4e81c28d80fecc120936a96db1c9964d14a8c72"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.357055 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" event={"ID":"1024faa3-55d0-47a5-ad2e-745ec92c0c89","Type":"ContainerStarted","Data":"b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.357121 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" event={"ID":"1024faa3-55d0-47a5-ad2e-745ec92c0c89","Type":"ContainerStarted","Data":"ebbb262a8e8c34c06d48c63da9f6ca39d9b2a84e972ac2728597e0ff984ec062"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.357483 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.358504 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" event={"ID":"03b20d3e-89d8-4ff9-b56b-611728f913ee","Type":"ContainerStarted","Data":"75890cab51918706663df7b2ffdcb068de73d5928258662b3ed97ec09c1f81c5"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.359585 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" event={"ID":"98560308-e35a-429d-8788-5526499119ec","Type":"ContainerStarted","Data":"a052d65881dcd48699ede46b1c83b2e2f43a4a954e1b6bb9f4955e380ce131ba"} Dec 01 15:03:23 crc 
kubenswrapper[4931]: I1201 15:03:23.360925 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" event={"ID":"d008c5dd-f44f-4509-b705-46b4c8819684","Type":"ContainerStarted","Data":"3a8821e9e5d4cc97f5dba915992803f3db2dbb5c3f438d62400cdb09ef7d92c8"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.360941 4931 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-b9wzm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.361001 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" podUID="1024faa3-55d0-47a5-ad2e-745ec92c0c89" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.362186 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" event={"ID":"855d1329-52b5-4c28-bef3-b18cb2a5e33e","Type":"ContainerStarted","Data":"b150ffc661e966b0b6b77cd043aed8ff8dac986b16089a16b8948d7839b620ba"} Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.415897 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.426787 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.428516 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:23.928497473 +0000 UTC m=+150.354371140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.515215 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9tvd2" podStartSLOduration=129.515194098 podStartE2EDuration="2m9.515194098s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:23.5114175 +0000 UTC m=+149.937291187" watchObservedRunningTime="2025-12-01 15:03:23.515194098 +0000 UTC m=+149.941067765" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.529709 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 
15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.530191 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:24.030168558 +0000 UTC m=+150.456042235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.565874 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:23 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:23 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:23 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.565981 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.593819 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-ckz9k" podStartSLOduration=129.593788311 podStartE2EDuration="2m9.593788311s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:23.587499381 +0000 UTC m=+150.013373038" watchObservedRunningTime="2025-12-01 15:03:23.593788311 +0000 UTC m=+150.019661978" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.664604 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.665313 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:24.16529063 +0000 UTC m=+150.591164297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.725663 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.731427 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.752951 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dgzzm"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.758875 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.763971 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z4bhx"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.766347 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.766720 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:24.266705037 +0000 UTC m=+150.692578704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.770494 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t6hrd"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.771003 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.839890 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b527d" podStartSLOduration=130.839302148 podStartE2EDuration="2m10.839302148s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:23.819531862 +0000 UTC m=+150.245405549" watchObservedRunningTime="2025-12-01 15:03:23.839302148 +0000 UTC m=+150.265175935" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.854125 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mkp6d"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.868784 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.869149 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:24.369131333 +0000 UTC m=+150.795005000 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.883672 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.918973 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fwfk5"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.925623 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.930352 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.932188 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-s2ws6"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.938831 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-drngn" podStartSLOduration=130.938807169 podStartE2EDuration="2m10.938807169s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:23.904857696 +0000 UTC m=+150.330731373" watchObservedRunningTime="2025-12-01 15:03:23.938807169 +0000 UTC m=+150.364680836" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.939166 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt"] Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.950993 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mbql2" podStartSLOduration=129.950970568 podStartE2EDuration="2m9.950970568s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:23.946943683 +0000 UTC m=+150.372817350" watchObservedRunningTime="2025-12-01 15:03:23.950970568 +0000 UTC m=+150.376844235" Dec 01 15:03:23 crc kubenswrapper[4931]: I1201 15:03:23.980481 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:23 crc kubenswrapper[4931]: E1201 15:03:23.980824 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:24.480798763 +0000 UTC m=+150.906672430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.002240 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" podStartSLOduration=130.002218037 podStartE2EDuration="2m10.002218037s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:23.997398059 +0000 UTC m=+150.423271726" watchObservedRunningTime="2025-12-01 15:03:24.002218037 +0000 UTC m=+150.428091704" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.081506 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.081924 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 15:03:24.581906941 +0000 UTC m=+151.007780608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.142949 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c8v74" podStartSLOduration=130.14292632 podStartE2EDuration="2m10.14292632s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:24.115659179 +0000 UTC m=+150.541532856" watchObservedRunningTime="2025-12-01 15:03:24.14292632 +0000 UTC m=+150.568799987" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.144115 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zz6cp" podStartSLOduration=131.144106514 podStartE2EDuration="2m11.144106514s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:24.141140039 +0000 UTC m=+150.567013706" watchObservedRunningTime="2025-12-01 15:03:24.144106514 +0000 UTC m=+150.569980181" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.183493 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.184149 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:24.684126971 +0000 UTC m=+151.110000638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.285921 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.286304 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:24.786287169 +0000 UTC m=+151.212160836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.387031 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.388340 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:24.888320284 +0000 UTC m=+151.314193961 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.450527 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" event={"ID":"10126253-d0e8-4f30-9047-1780a718e251","Type":"ContainerStarted","Data":"d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.453627 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.458917 4931 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ng4qc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.458972 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" podUID="10126253-d0e8-4f30-9047-1780a718e251" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.487298 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" 
event={"ID":"04b0fb8b-8694-4a16-affe-b4da74b2bd7b","Type":"ContainerStarted","Data":"2d253749962bf8596cd4082548692aca0312a04b87a2979bff567f8021475051"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.487345 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" event={"ID":"04b0fb8b-8694-4a16-affe-b4da74b2bd7b","Type":"ContainerStarted","Data":"1c6c7179f2825912e6f6439d27386925685676bd4090a1f3d8dd1cddaa1db012"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.488244 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.488625 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:24.988609089 +0000 UTC m=+151.414482756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.495709 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh" event={"ID":"0216ff96-a3b4-4486-91ab-f73485d18134","Type":"ContainerStarted","Data":"a2ff62500c108bdd02610d564bebcd370e8ff6e8ac9c6347a13221265e6603b8"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.508958 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" event={"ID":"d008c5dd-f44f-4509-b705-46b4c8819684","Type":"ContainerStarted","Data":"c592a428e1001ca7c30ec6909d8b6bb5d953909351bd5125d08ce329afa24f5f"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.512586 4931 generic.go:334] "Generic (PLEG): container finished" podID="9eef0925-d18a-4d34-9301-9cf5f900a39e" containerID="8f5cf5c3c4d717038ae7e15364bcf6c1e0ce0358f8be9aba87984ab49b4c4a40" exitCode=0 Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.512652 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" event={"ID":"9eef0925-d18a-4d34-9301-9cf5f900a39e","Type":"ContainerDied","Data":"8f5cf5c3c4d717038ae7e15364bcf6c1e0ce0358f8be9aba87984ab49b4c4a40"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.562305 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:24 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:24 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:24 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.562433 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.590240 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" event={"ID":"98a131c9-fc6c-4a27-a774-227258b380c0","Type":"ContainerStarted","Data":"9dc35aace1b82369a302f408af3779655c31571b76bee8ff330872097cb54890"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.590655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.592773 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.593091 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.093075243 +0000 UTC m=+151.518948910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.603153 4931 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gvtnh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.603218 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.629143 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2" event={"ID":"970e4401-a0dd-4b50-9ca7-45ae25a382b2","Type":"ContainerStarted","Data":"7f4ef4ba0f075b2195f844e3f6f1938ac41056e00774a497408de175fb592302"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.636897 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" event={"ID":"be21d285-dd3e-4577-b1fa-913c8ec20cb5","Type":"ContainerStarted","Data":"36af0d7d26ff0e95f3a6d340abca6e8514ec1c1e3c469bb5993239e0c1b21f47"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.657164 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" event={"ID":"b6615ae9-24a7-462b-af60-38c579d9529e","Type":"ContainerStarted","Data":"3e2947383614661e23f18990c1b712f993cbeb106f71efd66dc86ca326b1e8ff"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.657220 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" event={"ID":"b6615ae9-24a7-462b-af60-38c579d9529e","Type":"ContainerStarted","Data":"f478fed6082968cb2081625e4484c1d1a642574bc32fdce113d09848b6f65f03"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.669417 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" event={"ID":"3aebf976-96d6-4a5b-8650-bcc1fbf09566","Type":"ContainerStarted","Data":"224c64d35ff955f0cb2f39e594bd253a0428c9521677cca2a8624af36f4099df"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.692853 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" event={"ID":"855d1329-52b5-4c28-bef3-b18cb2a5e33e","Type":"ContainerStarted","Data":"6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.693018 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.693630 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.693090 4931 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.193074949 +0000 UTC m=+151.618948616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.693866 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.694421 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.194403647 +0000 UTC m=+151.620277314 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.703001 4931 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-l2v4w container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" start-of-body= Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.703068 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" podUID="855d1329-52b5-4c28-bef3-b18cb2a5e33e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.733001 4931 generic.go:334] "Generic (PLEG): container finished" podID="f62b6206-f14c-4f4d-a1c1-af09036abdcf" containerID="f335ad7d4724ba1ab371106db7ad885bca32547859a9658567b55f0283e43d68" exitCode=0 Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.733119 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" event={"ID":"f62b6206-f14c-4f4d-a1c1-af09036abdcf","Type":"ContainerDied","Data":"f335ad7d4724ba1ab371106db7ad885bca32547859a9658567b55f0283e43d68"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.733153 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" 
event={"ID":"f62b6206-f14c-4f4d-a1c1-af09036abdcf","Type":"ContainerStarted","Data":"1397d6bef821958d5ee5787db35daf047d318b9d414f9a0255e0b21852f3bbea"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.733876 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.764376 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" event={"ID":"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5","Type":"ContainerStarted","Data":"7ddc79dbbd67d88845c7910d5aad1f9318a061c6e92b65a56c51ff2f04822f30"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.766052 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.768044 4931 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7fkbx container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.768155 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" podUID="7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.772803 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" 
event={"ID":"77e59ae9-2ed1-4c42-a17b-95c677bac560","Type":"ContainerStarted","Data":"f7139e4e0fe33edbab2e264c19dc611ba32f44dff4566fb7c5ae3abbcf119916"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.776035 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" event={"ID":"13bbcadd-9916-4e44-8167-d562215116aa","Type":"ContainerStarted","Data":"7480d2eb9fb34bde15acab573f9787e895f335893e5fdb232f3c55700e4c872a"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.776066 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" event={"ID":"13bbcadd-9916-4e44-8167-d562215116aa","Type":"ContainerStarted","Data":"9f78e36dd1884b255779f75b93f8f8ed4b7b155d075580888fd974cb52de2a98"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.778537 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s2ws6" event={"ID":"4330e89b-63dc-4fa5-abee-3e383b77e182","Type":"ContainerStarted","Data":"513839f33c85c8cfe70315f7b55db2ed5acef02cfff3b76f78bfb0c0c9cb350b"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.795657 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.797292 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.297171703 +0000 UTC m=+151.723045390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.802793 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" event={"ID":"3eb6cc74-4036-413a-b4e9-77ebfd72dfe1","Type":"ContainerStarted","Data":"e47c5cc8b7ff04fdb80b8ed6ea522606348b78007815af769a8d87e8c0be3e70"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.809844 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" event={"ID":"03b20d3e-89d8-4ff9-b56b-611728f913ee","Type":"ContainerStarted","Data":"abea441d5fd31080c3850fcb55079d5d9fb8976d779a37186728c6648dd32b67"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.816168 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" event={"ID":"98560308-e35a-429d-8788-5526499119ec","Type":"ContainerStarted","Data":"a98b6aeed8bfd09359dc1a323807071c49f1ed909b0e571813a9adf99438abe7"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.827726 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" event={"ID":"dba22c56-9eb6-40b4-a6b6-559b3a847870","Type":"ContainerStarted","Data":"e6d65dad6fcf2023753ab4d5456c00b5adf642c157585302086d79d9b9dabb06"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.827794 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.829117 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" event={"ID":"059abb1c-735b-49e6-9645-af1cb2e289b6","Type":"ContainerStarted","Data":"aacd7d282dae36ae53579521ebdbb9f3fc8400d6d7cb949f8c828f3508084f0c"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.829322 4931 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rmx6p container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.829357 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" podUID="dba22c56-9eb6-40b4-a6b6-559b3a847870" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.854045 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" event={"ID":"aabe6734-63b5-412a-80f3-c07b3a9b3071","Type":"ContainerStarted","Data":"81b7be0fb8475d7ce7e5b24769fb1f7ad57444a28a256ac985b40bac79286958"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.859743 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" event={"ID":"cd7f3d80-238e-4e01-8d1a-4ee23eb29230","Type":"ContainerStarted","Data":"7b470bab7b6e112d11804351a95cb4859bd22dc6b16441aae8f7dd528099821f"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.865875 4931 patch_prober.go:28] interesting pod/console-operator-58897d9998-zkb6c 
container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.865926 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" podUID="cd7f3d80-238e-4e01-8d1a-4ee23eb29230" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.860622 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.878555 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" event={"ID":"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1","Type":"ContainerStarted","Data":"4ffed69ce637418836893e175d7296b794f896b135ef2b59891a2caf04366574"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.881939 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z4bhx" event={"ID":"c9176363-3b09-4c07-bf06-5a82e81a86e5","Type":"ContainerStarted","Data":"ff76c93c21014eca4eb9d3b2230672559e1ed2a2b8dba455434ebc8a3388700e"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.886198 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" event={"ID":"39618657-35da-4567-a8f8-7f53bd12b8be","Type":"ContainerStarted","Data":"26b928e9cc6890cf5fab68698ae8c0712179e3f66a0f45d5924da42527141711"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.889893 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d" 
event={"ID":"719cee4e-cf96-4192-b7ef-424be9e6759c","Type":"ContainerStarted","Data":"4271010499b218c3b3503200f6c3ddefd3657f10e482861760448bda1d8224e9"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.893872 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-csjj4" event={"ID":"c94c2159-be08-4c20-809f-2fc900cda887","Type":"ContainerStarted","Data":"46e0a45064826fc57789cb52eb5eb82bce699c0fdc7ba4a3248688379cbb0693"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.897049 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.898776 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.398764045 +0000 UTC m=+151.824637712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.904317 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" event={"ID":"2af8a4a0-d2ae-4438-b7f3-33999d922811","Type":"ContainerStarted","Data":"abfac7ff567d61056e676a45bf5144d2437610309f34ead46ba5bd4f9724f2c7"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.911609 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" podStartSLOduration=130.911591533 podStartE2EDuration="2m10.911591533s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:24.908021791 +0000 UTC m=+151.333895468" watchObservedRunningTime="2025-12-01 15:03:24.911591533 +0000 UTC m=+151.337465200" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.913175 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7" event={"ID":"48342a90-473d-4aef-a31c-dbc0b23fb352","Type":"ContainerStarted","Data":"59cc3b51b9f102723488475b628370340c130b8e3ba7f565547b0c1a9d7b7f07"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.913220 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7" 
event={"ID":"48342a90-473d-4aef-a31c-dbc0b23fb352","Type":"ContainerStarted","Data":"83944e801889d94bf60df96689d6e5e7a9118fc490121dff10f059f3b0dd3c49"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.917771 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" event={"ID":"433fc98e-4157-4260-996c-ce59a2b6dc52","Type":"ContainerStarted","Data":"ad2da18c4879f654130e863dcb7db55176e602bcb294eb833027bf113b3a98d5"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.922031 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" event={"ID":"a1867e4a-4d98-4c43-9fd2-ab06ffb3fd2d","Type":"ContainerStarted","Data":"aa48118ddccccc66e735be6998f3d9758921d3c5025aed1d964f499736a766d9"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.926595 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" event={"ID":"47bcdd61-f90c-433e-88d1-2677249c2a26","Type":"ContainerStarted","Data":"a16fba9e8626c0729388516c2ae48245820394a3d152867a629e5d134099f87b"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.926634 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" event={"ID":"47bcdd61-f90c-433e-88d1-2677249c2a26","Type":"ContainerStarted","Data":"f7a3b047ef0e0b74d0073fb5463572ced97ecd24a64c863a1e4ea0c5c7f9a984"} Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.926649 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.928079 4931 patch_prober.go:28] interesting pod/downloads-7954f5f757-zz6cp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.928116 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zz6cp" podUID="af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.942448 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.998000 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:24 crc kubenswrapper[4931]: E1201 15:03:24.998267 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.498249307 +0000 UTC m=+151.924122974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:24 crc kubenswrapper[4931]: I1201 15:03:24.998439 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.001191 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.501064257 +0000 UTC m=+151.926937944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.021556 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" podStartSLOduration=132.021538474 podStartE2EDuration="2m12.021538474s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.018885548 +0000 UTC m=+151.444759225" watchObservedRunningTime="2025-12-01 15:03:25.021538474 +0000 UTC m=+151.447412141" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.022053 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nhps2" podStartSLOduration=131.022045869 podStartE2EDuration="2m11.022045869s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:24.943191569 +0000 UTC m=+151.369065236" watchObservedRunningTime="2025-12-01 15:03:25.022045869 +0000 UTC m=+151.447919536" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.097759 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4pkk4" podStartSLOduration=131.097737169 podStartE2EDuration="2m11.097737169s" 
podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.090998185 +0000 UTC m=+151.516871852" watchObservedRunningTime="2025-12-01 15:03:25.097737169 +0000 UTC m=+151.523610836" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.099583 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.100854 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.600839567 +0000 UTC m=+152.026713224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.115213 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" podStartSLOduration=131.115198709 podStartE2EDuration="2m11.115198709s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.113928903 +0000 UTC m=+151.539802570" watchObservedRunningTime="2025-12-01 15:03:25.115198709 +0000 UTC m=+151.541072376" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.151378 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" podStartSLOduration=132.151356745 podStartE2EDuration="2m12.151356745s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.149875043 +0000 UTC m=+151.575748730" watchObservedRunningTime="2025-12-01 15:03:25.151356745 +0000 UTC m=+151.577230412" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.187199 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" podStartSLOduration=131.187180482 podStartE2EDuration="2m11.187180482s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.184130095 +0000 UTC m=+151.610003752" watchObservedRunningTime="2025-12-01 15:03:25.187180482 +0000 UTC m=+151.613054149" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.204115 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.204466 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.704454957 +0000 UTC m=+152.130328624 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.231811 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-t6hrd" podStartSLOduration=131.231790411 podStartE2EDuration="2m11.231790411s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.230900696 +0000 UTC m=+151.656774353" watchObservedRunningTime="2025-12-01 15:03:25.231790411 +0000 UTC m=+151.657664078" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.310067 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.310508 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.810492357 +0000 UTC m=+152.236366024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.311037 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qlml2" podStartSLOduration=131.311020512 podStartE2EDuration="2m11.311020512s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.274749022 +0000 UTC m=+151.700622689" watchObservedRunningTime="2025-12-01 15:03:25.311020512 +0000 UTC m=+151.736894179" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.312229 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" podStartSLOduration=131.312223617 podStartE2EDuration="2m11.312223617s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.310756904 +0000 UTC m=+151.736630571" watchObservedRunningTime="2025-12-01 15:03:25.312223617 +0000 UTC m=+151.738097284" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.351346 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" podStartSLOduration=131.351325967 podStartE2EDuration="2m11.351325967s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.3486277 +0000 UTC m=+151.774501367" watchObservedRunningTime="2025-12-01 15:03:25.351325967 +0000 UTC m=+151.777199634" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.383678 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-lxp5s" podStartSLOduration=132.383470989 podStartE2EDuration="2m12.383470989s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.383378076 +0000 UTC m=+151.809251743" watchObservedRunningTime="2025-12-01 15:03:25.383470989 +0000 UTC m=+151.809344656" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.417347 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.417810 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:25.917785972 +0000 UTC m=+152.343659629 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.436633 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pm6nw" podStartSLOduration=131.436586671 podStartE2EDuration="2m11.436586671s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.424556806 +0000 UTC m=+151.850430473" watchObservedRunningTime="2025-12-01 15:03:25.436586671 +0000 UTC m=+151.862460338" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.514032 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" podStartSLOduration=131.51401001 podStartE2EDuration="2m11.51401001s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.500920145 +0000 UTC m=+151.926793832" watchObservedRunningTime="2025-12-01 15:03:25.51401001 +0000 UTC m=+151.939883677" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.514548 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" podStartSLOduration=132.514544756 podStartE2EDuration="2m12.514544756s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.468941709 +0000 UTC m=+151.894815386" watchObservedRunningTime="2025-12-01 15:03:25.514544756 +0000 UTC m=+151.940418423" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.521039 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.521462 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.021444853 +0000 UTC m=+152.447318520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.563623 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:25 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:25 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:25 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.564002 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.589925 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" podStartSLOduration=131.589905036 podStartE2EDuration="2m11.589905036s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.588895137 +0000 UTC m=+152.014768804" watchObservedRunningTime="2025-12-01 15:03:25.589905036 +0000 UTC m=+152.015778703" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.627467 4931 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.627776 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.127763991 +0000 UTC m=+152.553637658 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.637506 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-csjj4" podStartSLOduration=6.63748322 podStartE2EDuration="6.63748322s" podCreationTimestamp="2025-12-01 15:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.632060654 +0000 UTC m=+152.057934321" watchObservedRunningTime="2025-12-01 15:03:25.63748322 +0000 UTC m=+152.063356877" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.718810 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z4bhx" podStartSLOduration=6.71879071 podStartE2EDuration="6.71879071s" 
podCreationTimestamp="2025-12-01 15:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.675744806 +0000 UTC m=+152.101618473" watchObservedRunningTime="2025-12-01 15:03:25.71879071 +0000 UTC m=+152.144664377" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.734111 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.734586 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.234569332 +0000 UTC m=+152.660442999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.766574 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc" podStartSLOduration=131.766555539 podStartE2EDuration="2m11.766555539s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.766090496 +0000 UTC m=+152.191964163" watchObservedRunningTime="2025-12-01 15:03:25.766555539 +0000 UTC m=+152.192429206" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.767578 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7cvqt" podStartSLOduration=132.767572158 podStartE2EDuration="2m12.767572158s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:25.718595894 +0000 UTC m=+152.144469561" watchObservedRunningTime="2025-12-01 15:03:25.767572158 +0000 UTC m=+152.193445825" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.836168 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.836663 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.336643598 +0000 UTC m=+152.762517265 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.934483 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" event={"ID":"aabe6734-63b5-412a-80f3-c07b3a9b3071","Type":"ContainerStarted","Data":"55b787f0c6ecfec72d6bdec2885ae9667ede05f932ff1327dce81bdf9aa69675"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.936480 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" event={"ID":"be21d285-dd3e-4577-b1fa-913c8ec20cb5","Type":"ContainerStarted","Data":"dabe8a6507569e802216a296ffeec7396c852ea3278a0dcdc0e5a991d1cc097d"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.937003 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.937252 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.437223851 +0000 UTC m=+152.863097528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.937464 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:25 crc kubenswrapper[4931]: E1201 15:03:25.937938 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.437921381 +0000 UTC m=+152.863795048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.938484 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" event={"ID":"9eef0925-d18a-4d34-9301-9cf5f900a39e","Type":"ContainerStarted","Data":"47c318b8d8cc71c2f9fece4f850b496ea0b9e45a47444b5085be509506f0f5f4"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.939598 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" event={"ID":"433fc98e-4157-4260-996c-ce59a2b6dc52","Type":"ContainerStarted","Data":"924f26e0c49cd59cacafbb7a1daf6a31a3be66e425acea98f678776df888432a"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.940788 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2" event={"ID":"970e4401-a0dd-4b50-9ca7-45ae25a382b2","Type":"ContainerStarted","Data":"03e6e405abe53540f36314887fc693640a254f52bc96e15b4dd5c9a4033c83f5"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.941950 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d" event={"ID":"719cee4e-cf96-4192-b7ef-424be9e6759c","Type":"ContainerStarted","Data":"b00c96c1e230315c6576b41d32d4cb4f75733cfa66fd641ea8e6d339e0a2b2db"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.941985 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d" 
event={"ID":"719cee4e-cf96-4192-b7ef-424be9e6759c","Type":"ContainerStarted","Data":"9c7a75350e412640245a0a472b13954db63378d6bdd159a0022c441ae0c6a4fb"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.943117 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh" event={"ID":"0216ff96-a3b4-4486-91ab-f73485d18134","Type":"ContainerStarted","Data":"a3705501656b0dd9c60236a237a5b3aaa5d00d2a1a57643a726cc9092b7ec363"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.944514 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rnsbw" event={"ID":"13bbcadd-9916-4e44-8167-d562215116aa","Type":"ContainerStarted","Data":"8b846f33340a99c7b2fd8b0d150fd86cfe18f56336c6f2f1136434e2aaa5e011"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.947106 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" event={"ID":"b6615ae9-24a7-462b-af60-38c579d9529e","Type":"ContainerStarted","Data":"6f3c44126d99d0939159ee96ab06f4337074d87ce8bf699fa1ee9d6a0d1082a8"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.948950 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7" event={"ID":"48342a90-473d-4aef-a31c-dbc0b23fb352","Type":"ContainerStarted","Data":"e67a9eca4adbac26c46e32e5affb9daea7cf9d9c1b0bdb78786ede174ba1b493"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.950755 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dgzzm" event={"ID":"2af8a4a0-d2ae-4438-b7f3-33999d922811","Type":"ContainerStarted","Data":"70a0eccbd4ecaa38a6bd78b02e31c1cb11cfc50467f717db807b84b5e15ff9d3"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.952020 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" event={"ID":"7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5","Type":"ContainerStarted","Data":"e8db9601ad01f0923fd7f2f709c5f10df5291f05bf4ccfe9982480945600f2fa"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.952724 4931 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7fkbx container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.952780 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" podUID="7bd585f7-1b0a-4c75-bd86-3cfa9e155cd5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.953828 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" event={"ID":"39618657-35da-4567-a8f8-7f53bd12b8be","Type":"ContainerStarted","Data":"a0aa677207c943f74fc8ff3a2bc27b0bc9594856676839f1ccfb57d638549aa0"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.955050 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" event={"ID":"dba22c56-9eb6-40b4-a6b6-559b3a847870","Type":"ContainerStarted","Data":"fa0750caa322381b605263a5b759f60ecc2ad9edb8b4e1a1a0bc47908b92c6df"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.955628 4931 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rmx6p container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: 
connect: connection refused" start-of-body= Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.955695 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" podUID="dba22c56-9eb6-40b4-a6b6-559b3a847870" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.957139 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" event={"ID":"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1","Type":"ContainerStarted","Data":"d7cbff307797349e8b2c98825cbda6422bbd564ce2b639d0bdd2bb871cada4f3"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.957173 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" event={"ID":"c3c33d78-1d0c-4f91-a93c-e27fe57bbce1","Type":"ContainerStarted","Data":"679b67269a3646afdb0ec2172140297b3c470fe90fea1e0060433027bbe537d3"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.958830 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s2ws6" event={"ID":"4330e89b-63dc-4fa5-abee-3e383b77e182","Type":"ContainerStarted","Data":"ebf9975c0011b201776f81e7673cd96bb6f49e75c2b388c9a9d3b41c515b9dac"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.958857 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s2ws6" event={"ID":"4330e89b-63dc-4fa5-abee-3e383b77e182","Type":"ContainerStarted","Data":"826f2bf8c9d6d6e1109929b831d559c11b4d967b0b996fcb4737bac44c57f964"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.958903 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.960981 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" event={"ID":"059abb1c-735b-49e6-9645-af1cb2e289b6","Type":"ContainerStarted","Data":"6454c54b415a1fb3ddf4ac537de14af4ec43c420612c88cba3956e250f5ec800"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.963119 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z4bhx" event={"ID":"c9176363-3b09-4c07-bf06-5a82e81a86e5","Type":"ContainerStarted","Data":"f4d047cb4650c1699f5d0cb9b0d46ac3e665bfef57961c00017e3a1a14f3a5ce"} Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.964084 4931 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gvtnh container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.964129 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.965122 4931 patch_prober.go:28] interesting pod/console-operator-58897d9998-zkb6c container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.965163 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" podUID="cd7f3d80-238e-4e01-8d1a-4ee23eb29230" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 01 15:03:25 crc kubenswrapper[4931]: I1201 15:03:25.997770 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.038617 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.039336 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.539307567 +0000 UTC m=+152.965181234 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.142718 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.143109 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.643088462 +0000 UTC m=+153.068962129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.221712 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" podStartSLOduration=133.221690425 podStartE2EDuration="2m13.221690425s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.112747392 +0000 UTC m=+152.538621059" watchObservedRunningTime="2025-12-01 15:03:26.221690425 +0000 UTC m=+152.647564102" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.237252 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.237322 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.239222 4931 patch_prober.go:28] interesting pod/apiserver-76f77b778f-2tbqk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.239349 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" podUID="aabe6734-63b5-412a-80f3-c07b3a9b3071" 
containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.251417 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.251718 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.751685505 +0000 UTC m=+153.177559172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.251857 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.252457 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.752449817 +0000 UTC m=+153.178323484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.282777 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.290577 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.290942 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.293334 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-x77jt" podStartSLOduration=132.293317478 podStartE2EDuration="2m12.293317478s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.223311141 +0000 UTC m=+152.649184808" watchObservedRunningTime="2025-12-01 15:03:26.293317478 +0000 UTC m=+152.719191145" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.296952 4931 
patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-p2z4b container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.297009 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" podUID="9eef0925-d18a-4d34-9301-9cf5f900a39e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.354105 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.354469 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.85445352 +0000 UTC m=+153.280327187 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.381272 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mkp6d" podStartSLOduration=132.381251508 podStartE2EDuration="2m12.381251508s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.330243656 +0000 UTC m=+152.756117323" watchObservedRunningTime="2025-12-01 15:03:26.381251508 +0000 UTC m=+152.807125175" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.381850 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2rcl2" podStartSLOduration=132.381843025 podStartE2EDuration="2m12.381843025s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.378943392 +0000 UTC m=+152.804817059" watchObservedRunningTime="2025-12-01 15:03:26.381843025 +0000 UTC m=+152.807716692" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.447014 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-nx8gh" podStartSLOduration=132.446996413 podStartE2EDuration="2m12.446996413s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.444857372 +0000 UTC m=+152.870731039" watchObservedRunningTime="2025-12-01 15:03:26.446996413 +0000 UTC m=+152.872870080" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.456303 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.456678 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:26.95666292 +0000 UTC m=+153.382536587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.481148 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bwlrf" podStartSLOduration=132.481120071 podStartE2EDuration="2m12.481120071s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.474266014 +0000 UTC m=+152.900139681" watchObservedRunningTime="2025-12-01 15:03:26.481120071 +0000 UTC m=+152.906993738" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.558011 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.558318 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.058292133 +0000 UTC m=+153.484165800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.565615 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:26 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:26 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:26 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.565685 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.568834 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" podStartSLOduration=132.568819635 podStartE2EDuration="2m12.568819635s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.561048712 +0000 UTC m=+152.986922399" watchObservedRunningTime="2025-12-01 15:03:26.568819635 +0000 UTC m=+152.994693312" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.635154 4931 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnlv9" podStartSLOduration=133.635133785 podStartE2EDuration="2m13.635133785s" podCreationTimestamp="2025-12-01 15:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.600355329 +0000 UTC m=+153.026228996" watchObservedRunningTime="2025-12-01 15:03:26.635133785 +0000 UTC m=+153.061007452" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.637962 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4b5h7" podStartSLOduration=132.637955936 podStartE2EDuration="2m12.637955936s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.634778865 +0000 UTC m=+153.060652532" watchObservedRunningTime="2025-12-01 15:03:26.637955936 +0000 UTC m=+153.063829593" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.662321 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.662672 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.162659364 +0000 UTC m=+153.588533031 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.664266 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s2ws6" podStartSLOduration=7.66425477 podStartE2EDuration="7.66425477s" podCreationTimestamp="2025-12-01 15:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.663632262 +0000 UTC m=+153.089505919" watchObservedRunningTime="2025-12-01 15:03:26.66425477 +0000 UTC m=+153.090128437" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.692165 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsz87" podStartSLOduration=132.692144609 podStartE2EDuration="2m12.692144609s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.692049607 +0000 UTC m=+153.117923264" watchObservedRunningTime="2025-12-01 15:03:26.692144609 +0000 UTC m=+153.118018276" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.747338 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-m5dvh" podStartSLOduration=132.747319641 podStartE2EDuration="2m12.747319641s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:26.745072957 +0000 UTC m=+153.170946624" watchObservedRunningTime="2025-12-01 15:03:26.747319641 +0000 UTC m=+153.173193308" Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.765081 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.765314 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.265285886 +0000 UTC m=+153.691159553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.765404 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.765902 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.265881403 +0000 UTC m=+153.691755070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.867736 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.868086 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.368069192 +0000 UTC m=+153.793942859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.968806 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:26 crc kubenswrapper[4931]: E1201 15:03:26.969196 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.46918311 +0000 UTC m=+153.895056777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:26 crc kubenswrapper[4931]: I1201 15:03:26.995464 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" event={"ID":"433fc98e-4157-4260-996c-ce59a2b6dc52","Type":"ContainerStarted","Data":"1d42ab53229e423d3f33c834ebd037e4d452cad9108c963574a593f40057d05a"} Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.003892 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.009824 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rmx6p" Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.041818 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fkbx" Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.070243 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.072679 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.572655366 +0000 UTC m=+153.998529033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.173812 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.174193 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.674179306 +0000 UTC m=+154.100052973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.277949 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.278359 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.778340012 +0000 UTC m=+154.204213679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.379653 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.380045 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.880032747 +0000 UTC m=+154.305906414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.392466 4931 patch_prober.go:28] interesting pod/console-operator-58897d9998-zkb6c container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 01 15:03:27 crc kubenswrapper[4931]: [+]log ok Dec 01 15:03:27 crc kubenswrapper[4931]: [-]poststarthook/max-in-flight-filter failed: reason withheld Dec 01 15:03:27 crc kubenswrapper[4931]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld Dec 01 15:03:27 crc kubenswrapper[4931]: [+]shutdown ok Dec 01 15:03:27 crc kubenswrapper[4931]: readyz check failed Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.392562 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" podUID="cd7f3d80-238e-4e01-8d1a-4ee23eb29230" containerName="console-operator" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.481129 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.481421 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:27.981406421 +0000 UTC m=+154.407280088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.489177 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-knjbc" Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.559963 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:27 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:27 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:27 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.560043 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.582708 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.583182 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.083163298 +0000 UTC m=+154.509036955 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.683821 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.684032 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.184002548 +0000 UTC m=+154.609876205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.684147 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.684540 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.184528274 +0000 UTC m=+154.610401941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.785700 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.785926 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.285897869 +0000 UTC m=+154.711771536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.786582 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.786940 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.286922688 +0000 UTC m=+154.712796355 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.888594 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.888807 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.388775938 +0000 UTC m=+154.814649605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.888941 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.889418 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.389404706 +0000 UTC m=+154.815278373 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.990491 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.990725 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.490690449 +0000 UTC m=+154.916564126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:27 crc kubenswrapper[4931]: I1201 15:03:27.990770 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:27 crc kubenswrapper[4931]: E1201 15:03:27.991137 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.491123752 +0000 UTC m=+154.916997419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.002247 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" event={"ID":"433fc98e-4157-4260-996c-ce59a2b6dc52","Type":"ContainerStarted","Data":"ea6de0c0d03062fdfa428a5ef29eb1fee37a44716e5caabbdaeff6f19ea2eb9e"} Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.092171 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.092369 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.592338813 +0000 UTC m=+155.018212480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.094526 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.095016 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.594996129 +0000 UTC m=+155.020869796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.152126 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p62ld"] Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.153127 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.155268 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.195840 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.196040 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.696004964 +0000 UTC m=+155.121878631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.196173 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.196214 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmc5g\" (UniqueName: \"kubernetes.io/projected/eeec19ab-af88-4a66-8414-a15046f37aaf-kube-api-access-wmc5g\") pod \"community-operators-p62ld\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.196254 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-utilities\") pod \"community-operators-p62ld\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.196299 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-catalog-content\") pod \"community-operators-p62ld\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.196578 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.69657096 +0000 UTC m=+155.122444627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.214188 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p62ld"] Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.297511 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.297780 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 15:03:28.79773302 +0000 UTC m=+155.223606697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.297920 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-utilities\") pod \"community-operators-p62ld\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.297965 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-catalog-content\") pod \"community-operators-p62ld\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.298023 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.298060 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmc5g\" (UniqueName: 
\"kubernetes.io/projected/eeec19ab-af88-4a66-8414-a15046f37aaf-kube-api-access-wmc5g\") pod \"community-operators-p62ld\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.298432 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.798412149 +0000 UTC m=+155.224285816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.298540 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-utilities\") pod \"community-operators-p62ld\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.298602 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-catalog-content\") pod \"community-operators-p62ld\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.342339 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmc5g\" (UniqueName: 
\"kubernetes.io/projected/eeec19ab-af88-4a66-8414-a15046f37aaf-kube-api-access-wmc5g\") pod \"community-operators-p62ld\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.343326 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sfgd5"] Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.344406 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.356613 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.359624 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfgd5"] Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.401955 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.402322 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-utilities\") pod \"certified-operators-sfgd5\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.402374 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27rkm\" (UniqueName: 
\"kubernetes.io/projected/3667d33f-0665-4c2a-bbed-a160c3d48ed9-kube-api-access-27rkm\") pod \"certified-operators-sfgd5\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.402423 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-catalog-content\") pod \"certified-operators-sfgd5\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.402535 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:28.902515843 +0000 UTC m=+155.328389510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.468648 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.504418 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27rkm\" (UniqueName: \"kubernetes.io/projected/3667d33f-0665-4c2a-bbed-a160c3d48ed9-kube-api-access-27rkm\") pod \"certified-operators-sfgd5\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.504479 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-catalog-content\") pod \"certified-operators-sfgd5\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.504513 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.504552 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-utilities\") pod \"certified-operators-sfgd5\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.504913 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 15:03:29.004899068 +0000 UTC m=+155.430772735 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.505077 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-utilities\") pod \"certified-operators-sfgd5\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.505202 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-catalog-content\") pod \"certified-operators-sfgd5\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.548542 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pmdbm"] Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.549583 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.557310 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27rkm\" (UniqueName: \"kubernetes.io/projected/3667d33f-0665-4c2a-bbed-a160c3d48ed9-kube-api-access-27rkm\") pod \"certified-operators-sfgd5\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.558682 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:28 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:28 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:28 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.558758 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.569206 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmdbm"] Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.605025 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.605553 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-utilities\") pod \"community-operators-pmdbm\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.605657 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-catalog-content\") pod \"community-operators-pmdbm\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.605745 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbpwf\" (UniqueName: \"kubernetes.io/projected/9a36eb78-7328-4941-be7b-33191ddfb5b5-kube-api-access-xbpwf\") pod \"community-operators-pmdbm\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.605901 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:29.105885703 +0000 UTC m=+155.531759370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.661099 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.704811 4931 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.708178 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-utilities\") pod \"community-operators-pmdbm\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.708219 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.708242 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-catalog-content\") pod 
\"community-operators-pmdbm\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.708268 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbpwf\" (UniqueName: \"kubernetes.io/projected/9a36eb78-7328-4941-be7b-33191ddfb5b5-kube-api-access-xbpwf\") pod \"community-operators-pmdbm\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.708897 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:29.208874255 +0000 UTC m=+155.634747922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.710289 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-catalog-content\") pod \"community-operators-pmdbm\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.713834 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-utilities\") pod 
\"community-operators-pmdbm\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.830268 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.830515 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:29.330495051 +0000 UTC m=+155.756368718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.830930 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.831320 4931 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:29.331312494 +0000 UTC m=+155.757186151 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.841598 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbpwf\" (UniqueName: \"kubernetes.io/projected/9a36eb78-7328-4941-be7b-33191ddfb5b5-kube-api-access-xbpwf\") pod \"community-operators-pmdbm\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.866297 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jwhzl"] Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.868374 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.871402 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwhzl"] Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.900143 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.934576 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.934782 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-catalog-content\") pod \"certified-operators-jwhzl\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.934837 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw6cj\" (UniqueName: \"kubernetes.io/projected/0ca715d0-92c9-403d-a09f-e86fbb5c585b-kube-api-access-lw6cj\") pod \"certified-operators-jwhzl\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:28 crc kubenswrapper[4931]: I1201 15:03:28.934874 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-utilities\") pod \"certified-operators-jwhzl\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:28 crc kubenswrapper[4931]: E1201 15:03:28.935007 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-01 15:03:29.434990496 +0000 UTC m=+155.860864163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.041228 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-catalog-content\") pod \"certified-operators-jwhzl\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.041296 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw6cj\" (UniqueName: \"kubernetes.io/projected/0ca715d0-92c9-403d-a09f-e86fbb5c585b-kube-api-access-lw6cj\") pod \"certified-operators-jwhzl\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.041331 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.041355 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-utilities\") pod \"certified-operators-jwhzl\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.042185 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-utilities\") pod \"certified-operators-jwhzl\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:29 crc kubenswrapper[4931]: E1201 15:03:29.042627 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:29.542605851 +0000 UTC m=+155.968479518 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.042929 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-catalog-content\") pod \"certified-operators-jwhzl\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.079585 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" 
event={"ID":"433fc98e-4157-4260-996c-ce59a2b6dc52","Type":"ContainerStarted","Data":"580a7063cddd5de287970ca92167239f044861905880dae05ead4b6edf72c3a8"} Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.091565 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw6cj\" (UniqueName: \"kubernetes.io/projected/0ca715d0-92c9-403d-a09f-e86fbb5c585b-kube-api-access-lw6cj\") pod \"certified-operators-jwhzl\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.142091 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:29 crc kubenswrapper[4931]: E1201 15:03:29.143562 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:29.643544494 +0000 UTC m=+156.069418161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.206706 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p62ld"] Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.207295 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-fwfk5" podStartSLOduration=10.20726635 podStartE2EDuration="10.20726635s" podCreationTimestamp="2025-12-01 15:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:29.203012998 +0000 UTC m=+155.628886665" watchObservedRunningTime="2025-12-01 15:03:29.20726635 +0000 UTC m=+155.633140027" Dec 01 15:03:29 crc kubenswrapper[4931]: W1201 15:03:29.236668 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeec19ab_af88_4a66_8414_a15046f37aaf.slice/crio-bb7119ac85ba7f2340a6faae4363e8bd68ea893d7592bec5f185ffd8ec101bbb WatchSource:0}: Error finding container bb7119ac85ba7f2340a6faae4363e8bd68ea893d7592bec5f185ffd8ec101bbb: Status 404 returned error can't find the container with id bb7119ac85ba7f2340a6faae4363e8bd68ea893d7592bec5f185ffd8ec101bbb Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.236986 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.244127 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:29 crc kubenswrapper[4931]: E1201 15:03:29.244424 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 15:03:29.744412495 +0000 UTC m=+156.170286162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gm6vp" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.352042 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:29 crc kubenswrapper[4931]: E1201 15:03:29.353596 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 15:03:29.85237226 +0000 UTC m=+156.278245927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.401489 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfgd5"] Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.414529 4931 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T15:03:28.705200659Z","Handler":null,"Name":""} Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.435525 4931 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.435557 4931 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.454857 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.514308 4931 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.514773 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.577647 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:29 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:29 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:29 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.577717 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.608897 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmdbm"] Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.813853 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gm6vp\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") " pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.860911 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jwhzl"] Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.869001 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 15:03:29 crc kubenswrapper[4931]: W1201 15:03:29.872311 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ca715d0_92c9_403d_a09f_e86fbb5c585b.slice/crio-0f324665963a43451fa5fc9ea7a01b87bddd1376ad1ec4dd45e96f062c948b0d WatchSource:0}: Error finding container 0f324665963a43451fa5fc9ea7a01b87bddd1376ad1ec4dd45e96f062c948b0d: Status 404 returned error can't find the container with id 0f324665963a43451fa5fc9ea7a01b87bddd1376ad1ec4dd45e96f062c948b0d Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.875218 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 15:03:29 crc kubenswrapper[4931]: I1201 15:03:29.962305 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.104799 4931 generic.go:334] "Generic (PLEG): container finished" podID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerID="2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2" exitCode=0 Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.104882 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwhzl" event={"ID":"0ca715d0-92c9-403d-a09f-e86fbb5c585b","Type":"ContainerDied","Data":"2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2"} Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.104917 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwhzl" event={"ID":"0ca715d0-92c9-403d-a09f-e86fbb5c585b","Type":"ContainerStarted","Data":"0f324665963a43451fa5fc9ea7a01b87bddd1376ad1ec4dd45e96f062c948b0d"} Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.108972 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.116907 4931 generic.go:334] "Generic (PLEG): container finished" podID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerID="64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb" exitCode=0 Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.117208 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfgd5" event={"ID":"3667d33f-0665-4c2a-bbed-a160c3d48ed9","Type":"ContainerDied","Data":"64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb"} Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.117273 4931 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-sfgd5" event={"ID":"3667d33f-0665-4c2a-bbed-a160c3d48ed9","Type":"ContainerStarted","Data":"883af56b468fa3697ed258b11906263712bf18789f4f0a4602cb0b51dac58a6e"} Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.171945 4931 generic.go:334] "Generic (PLEG): container finished" podID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerID="dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336" exitCode=0 Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.172471 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmdbm" event={"ID":"9a36eb78-7328-4941-be7b-33191ddfb5b5","Type":"ContainerDied","Data":"dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336"} Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.172505 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmdbm" event={"ID":"9a36eb78-7328-4941-be7b-33191ddfb5b5","Type":"ContainerStarted","Data":"e65f2563f0f120b9b9d199df36fa6e31cf4647b1bb0d855edd8aee36324fc36e"} Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.222778 4931 generic.go:334] "Generic (PLEG): container finished" podID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerID="e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2" exitCode=0 Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.224378 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p62ld" event={"ID":"eeec19ab-af88-4a66-8414-a15046f37aaf","Type":"ContainerDied","Data":"e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2"} Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.224424 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p62ld" 
event={"ID":"eeec19ab-af88-4a66-8414-a15046f37aaf","Type":"ContainerStarted","Data":"bb7119ac85ba7f2340a6faae4363e8bd68ea893d7592bec5f185ffd8ec101bbb"} Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.262966 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.326032 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9vldw"] Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.327459 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.331071 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.341405 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vldw"] Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.376546 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-catalog-content\") pod \"redhat-marketplace-9vldw\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.376635 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh22s\" (UniqueName: \"kubernetes.io/projected/6d290cb6-63ef-49e0-8772-c74447b6fcff-kube-api-access-nh22s\") pod \"redhat-marketplace-9vldw\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc 
kubenswrapper[4931]: I1201 15:03:30.376679 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-utilities\") pod \"redhat-marketplace-9vldw\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.477744 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-catalog-content\") pod \"redhat-marketplace-9vldw\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.478338 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-catalog-content\") pod \"redhat-marketplace-9vldw\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.478455 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh22s\" (UniqueName: \"kubernetes.io/projected/6d290cb6-63ef-49e0-8772-c74447b6fcff-kube-api-access-nh22s\") pod \"redhat-marketplace-9vldw\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.478938 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-utilities\") pod \"redhat-marketplace-9vldw\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: 
I1201 15:03:30.479480 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-utilities\") pod \"redhat-marketplace-9vldw\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.484197 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gm6vp"] Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.506797 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh22s\" (UniqueName: \"kubernetes.io/projected/6d290cb6-63ef-49e0-8772-c74447b6fcff-kube-api-access-nh22s\") pod \"redhat-marketplace-9vldw\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.558310 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:30 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:30 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:30 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.558375 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.662698 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.733546 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j4vb5"] Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.736067 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.745219 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4vb5"] Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.887461 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-catalog-content\") pod \"redhat-marketplace-j4vb5\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.887526 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-utilities\") pod \"redhat-marketplace-j4vb5\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.887776 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn6fx\" (UniqueName: \"kubernetes.io/projected/2e430b98-f909-4329-bfe8-2cc67aba88fb-kube-api-access-jn6fx\") pod \"redhat-marketplace-j4vb5\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.968896 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-9vldw"] Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.991151 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn6fx\" (UniqueName: \"kubernetes.io/projected/2e430b98-f909-4329-bfe8-2cc67aba88fb-kube-api-access-jn6fx\") pod \"redhat-marketplace-j4vb5\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.991232 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-catalog-content\") pod \"redhat-marketplace-j4vb5\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.991266 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-utilities\") pod \"redhat-marketplace-j4vb5\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.991766 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-utilities\") pod \"redhat-marketplace-j4vb5\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:30 crc kubenswrapper[4931]: I1201 15:03:30.991899 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-catalog-content\") pod \"redhat-marketplace-j4vb5\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " 
pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.018490 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn6fx\" (UniqueName: \"kubernetes.io/projected/2e430b98-f909-4329-bfe8-2cc67aba88fb-kube-api-access-jn6fx\") pod \"redhat-marketplace-j4vb5\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.064588 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.234992 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.237703 4931 generic.go:334] "Generic (PLEG): container finished" podID="d008c5dd-f44f-4509-b705-46b4c8819684" containerID="c592a428e1001ca7c30ec6909d8b6bb5d953909351bd5125d08ce329afa24f5f" exitCode=0 Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.237805 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" event={"ID":"d008c5dd-f44f-4509-b705-46b4c8819684","Type":"ContainerDied","Data":"c592a428e1001ca7c30ec6909d8b6bb5d953909351bd5125d08ce329afa24f5f"} Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.243334 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-2tbqk" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.244738 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" event={"ID":"e2229b8c-268a-46fd-bb3d-442032e330ff","Type":"ContainerStarted","Data":"a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad"} Dec 01 15:03:31 crc kubenswrapper[4931]: 
I1201 15:03:31.244769 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" event={"ID":"e2229b8c-268a-46fd-bb3d-442032e330ff","Type":"ContainerStarted","Data":"163a48f10d5caf73a90fb599cce8a55e08bb0b8fdaa40c17a8b219dc7230da34"} Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.244884 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.247772 4931 generic.go:334] "Generic (PLEG): container finished" podID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerID="b27d9c4336a03ae4746269cb17457b2de493d78317bb797fb0e9dde1ee4e3d90" exitCode=0 Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.247815 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vldw" event={"ID":"6d290cb6-63ef-49e0-8772-c74447b6fcff","Type":"ContainerDied","Data":"b27d9c4336a03ae4746269cb17457b2de493d78317bb797fb0e9dde1ee4e3d90"} Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.247840 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vldw" event={"ID":"6d290cb6-63ef-49e0-8772-c74447b6fcff","Type":"ContainerStarted","Data":"db853f56b468c51526b7c819de767a307a4981de89b4b80715f49f44039979b2"} Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.299282 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.307576 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p2z4b" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.340629 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-47bmf"] Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 
15:03:31.341912 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.343243 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" podStartSLOduration=137.343013207 podStartE2EDuration="2m17.343013207s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:31.330819708 +0000 UTC m=+157.756693375" watchObservedRunningTime="2025-12-01 15:03:31.343013207 +0000 UTC m=+157.768886894" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.346713 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.351483 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47bmf"] Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.356029 4931 patch_prober.go:28] interesting pod/downloads-7954f5f757-zz6cp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.357409 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zz6cp" podUID="af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.357830 4931 patch_prober.go:28] interesting pod/downloads-7954f5f757-zz6cp container/download-server namespace/openshift-console: Liveness probe 
status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.357856 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zz6cp" podUID="af75a1fc-4c7b-4f3b-bc50-9d0fd4d7f52f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.357948 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.358001 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.369571 4931 patch_prober.go:28] interesting pod/console-f9d7485db-drngn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.369645 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-drngn" podUID="c907e960-f833-4546-89df-491334c4fe72" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.402934 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vch6b\" (UniqueName: \"kubernetes.io/projected/8409825a-f886-4896-9c0c-919d12b3761c-kube-api-access-vch6b\") pod \"redhat-operators-47bmf\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc 
kubenswrapper[4931]: I1201 15:03:31.403000 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-catalog-content\") pod \"redhat-operators-47bmf\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.403083 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-utilities\") pod \"redhat-operators-47bmf\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.466082 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zkb6c" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.518899 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vch6b\" (UniqueName: \"kubernetes.io/projected/8409825a-f886-4896-9c0c-919d12b3761c-kube-api-access-vch6b\") pod \"redhat-operators-47bmf\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.518964 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-catalog-content\") pod \"redhat-operators-47bmf\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.519041 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-utilities\") pod \"redhat-operators-47bmf\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.519700 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-utilities\") pod \"redhat-operators-47bmf\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.521172 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-catalog-content\") pod \"redhat-operators-47bmf\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.559986 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.575726 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:31 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:31 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:31 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.575827 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:31 crc 
kubenswrapper[4931]: I1201 15:03:31.576348 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.577495 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.579588 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vch6b\" (UniqueName: \"kubernetes.io/projected/8409825a-f886-4896-9c0c-919d12b3761c-kube-api-access-vch6b\") pod \"redhat-operators-47bmf\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.580313 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.582423 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.583471 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.619997 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.620086 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.635672 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4vb5"] Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.699888 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.724346 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.724424 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.724620 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.735759 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-47ntf"] Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.739195 4931 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.762948 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47ntf"] Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.773749 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.825849 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-catalog-content\") pod \"redhat-operators-47ntf\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.825938 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-utilities\") pod \"redhat-operators-47ntf\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.825976 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6rg\" (UniqueName: \"kubernetes.io/projected/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-kube-api-access-mt6rg\") pod \"redhat-operators-47ntf\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.920027 4931 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.928769 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-utilities\") pod \"redhat-operators-47ntf\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.928817 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6rg\" (UniqueName: \"kubernetes.io/projected/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-kube-api-access-mt6rg\") pod \"redhat-operators-47ntf\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.928886 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-catalog-content\") pod \"redhat-operators-47ntf\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.929299 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-catalog-content\") pod \"redhat-operators-47ntf\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.929870 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-utilities\") pod \"redhat-operators-47ntf\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " 
pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:31 crc kubenswrapper[4931]: I1201 15:03:31.977453 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6rg\" (UniqueName: \"kubernetes.io/projected/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-kube-api-access-mt6rg\") pod \"redhat-operators-47ntf\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.093443 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.194769 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47bmf"] Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.302813 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47bmf" event={"ID":"8409825a-f886-4896-9c0c-919d12b3761c","Type":"ContainerStarted","Data":"51f810eca2c2a1797f50ce4a72e1be418914cce8040d6f6316cb69727b7fe63e"} Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.305562 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.316168 4931 generic.go:334] "Generic (PLEG): container finished" podID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerID="0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755" exitCode=0 Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.319604 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4vb5" event={"ID":"2e430b98-f909-4329-bfe8-2cc67aba88fb","Type":"ContainerDied","Data":"0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755"} Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.319716 4931 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-j4vb5" event={"ID":"2e430b98-f909-4329-bfe8-2cc67aba88fb","Type":"ContainerStarted","Data":"41b352de93f44d3121655c8ca95cbd2faada189e5337bfbb3f6eedfebb0cb228"} Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.575271 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:32 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:32 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:32 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.575358 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.688035 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.731611 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47ntf"] Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.763282 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skvvg\" (UniqueName: \"kubernetes.io/projected/d008c5dd-f44f-4509-b705-46b4c8819684-kube-api-access-skvvg\") pod \"d008c5dd-f44f-4509-b705-46b4c8819684\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.764094 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d008c5dd-f44f-4509-b705-46b4c8819684-config-volume\") pod \"d008c5dd-f44f-4509-b705-46b4c8819684\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.764236 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d008c5dd-f44f-4509-b705-46b4c8819684-secret-volume\") pod \"d008c5dd-f44f-4509-b705-46b4c8819684\" (UID: \"d008c5dd-f44f-4509-b705-46b4c8819684\") " Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.765115 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008c5dd-f44f-4509-b705-46b4c8819684-config-volume" (OuterVolumeSpecName: "config-volume") pod "d008c5dd-f44f-4509-b705-46b4c8819684" (UID: "d008c5dd-f44f-4509-b705-46b4c8819684"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.765332 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d008c5dd-f44f-4509-b705-46b4c8819684-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.769926 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d008c5dd-f44f-4509-b705-46b4c8819684-kube-api-access-skvvg" (OuterVolumeSpecName: "kube-api-access-skvvg") pod "d008c5dd-f44f-4509-b705-46b4c8819684" (UID: "d008c5dd-f44f-4509-b705-46b4c8819684"). InnerVolumeSpecName "kube-api-access-skvvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.771802 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008c5dd-f44f-4509-b705-46b4c8819684-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d008c5dd-f44f-4509-b705-46b4c8819684" (UID: "d008c5dd-f44f-4509-b705-46b4c8819684"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:03:32 crc kubenswrapper[4931]: W1201 15:03:32.790005 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d5070d6_9ed9_4a75_95a0_c1d57b468c58.slice/crio-316c88921699281b4536f907c993d729f17717b63da0719f561a78404672ae71 WatchSource:0}: Error finding container 316c88921699281b4536f907c993d729f17717b63da0719f561a78404672ae71: Status 404 returned error can't find the container with id 316c88921699281b4536f907c993d729f17717b63da0719f561a78404672ae71 Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.867654 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skvvg\" (UniqueName: \"kubernetes.io/projected/d008c5dd-f44f-4509-b705-46b4c8819684-kube-api-access-skvvg\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:32 crc kubenswrapper[4931]: I1201 15:03:32.867717 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d008c5dd-f44f-4509-b705-46b4c8819684-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.339666 4931 generic.go:334] "Generic (PLEG): container finished" podID="8409825a-f886-4896-9c0c-919d12b3761c" containerID="e1ba5a4f789fad0996cecf5c6b77e655605e7f3e32b9f904253bf23c7ac011cb" exitCode=0 Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.339867 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47bmf" event={"ID":"8409825a-f886-4896-9c0c-919d12b3761c","Type":"ContainerDied","Data":"e1ba5a4f789fad0996cecf5c6b77e655605e7f3e32b9f904253bf23c7ac011cb"} Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.346590 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47ntf" 
event={"ID":"4d5070d6-9ed9-4a75-95a0-c1d57b468c58","Type":"ContainerDied","Data":"09b50bcd61eb303c9ecf6aabe4fd62f33fcb3158530bb679233498c4f2ae0d6f"} Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.346659 4931 generic.go:334] "Generic (PLEG): container finished" podID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerID="09b50bcd61eb303c9ecf6aabe4fd62f33fcb3158530bb679233498c4f2ae0d6f" exitCode=0 Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.346746 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47ntf" event={"ID":"4d5070d6-9ed9-4a75-95a0-c1d57b468c58","Type":"ContainerStarted","Data":"316c88921699281b4536f907c993d729f17717b63da0719f561a78404672ae71"} Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.375939 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.375935 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl" event={"ID":"d008c5dd-f44f-4509-b705-46b4c8819684","Type":"ContainerDied","Data":"3a8821e9e5d4cc97f5dba915992803f3db2dbb5c3f438d62400cdb09ef7d92c8"} Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.377867 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a8821e9e5d4cc97f5dba915992803f3db2dbb5c3f438d62400cdb09ef7d92c8" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.388561 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"739749a0-ea46-49d1-bf55-14f1e1bf7d21","Type":"ContainerStarted","Data":"f1418aabf23bdf22d364a72dbf642e28a3774b034cb4db334bf9be34ad91533f"} Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.388651 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"739749a0-ea46-49d1-bf55-14f1e1bf7d21","Type":"ContainerStarted","Data":"ce33d963c914aca48c6946fa0be94f92ff68c18fa2e8ade61b016a7719a1e5ea"} Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.411485 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.411457985 podStartE2EDuration="2.411457985s" podCreationTimestamp="2025-12-01 15:03:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:03:33.409200891 +0000 UTC m=+159.835074568" watchObservedRunningTime="2025-12-01 15:03:33.411457985 +0000 UTC m=+159.837331652" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.561773 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:33 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:33 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:33 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.562213 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.562109 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 15:03:33 crc kubenswrapper[4931]: E1201 15:03:33.562890 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d008c5dd-f44f-4509-b705-46b4c8819684" 
containerName="collect-profiles" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.562904 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d008c5dd-f44f-4509-b705-46b4c8819684" containerName="collect-profiles" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.563029 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d008c5dd-f44f-4509-b705-46b4c8819684" containerName="collect-profiles" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.563557 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.566953 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.567025 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.575696 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.700124 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52575402-6ccd-41d9-906e-41f21b38e744-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"52575402-6ccd-41d9-906e-41f21b38e744\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.700199 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52575402-6ccd-41d9-906e-41f21b38e744-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"52575402-6ccd-41d9-906e-41f21b38e744\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:33 crc 
kubenswrapper[4931]: I1201 15:03:33.802300 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52575402-6ccd-41d9-906e-41f21b38e744-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"52575402-6ccd-41d9-906e-41f21b38e744\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.802398 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52575402-6ccd-41d9-906e-41f21b38e744-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"52575402-6ccd-41d9-906e-41f21b38e744\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.802569 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52575402-6ccd-41d9-906e-41f21b38e744-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"52575402-6ccd-41d9-906e-41f21b38e744\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.835552 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52575402-6ccd-41d9-906e-41f21b38e744-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"52575402-6ccd-41d9-906e-41f21b38e744\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:33 crc kubenswrapper[4931]: I1201 15:03:33.908294 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:34 crc kubenswrapper[4931]: I1201 15:03:34.401343 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 15:03:34 crc kubenswrapper[4931]: W1201 15:03:34.415613 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod52575402_6ccd_41d9_906e_41f21b38e744.slice/crio-ec188560ae1f5786dc221660b8ec21ebdf4985f8e80577f545f3263fb9d09a1f WatchSource:0}: Error finding container ec188560ae1f5786dc221660b8ec21ebdf4985f8e80577f545f3263fb9d09a1f: Status 404 returned error can't find the container with id ec188560ae1f5786dc221660b8ec21ebdf4985f8e80577f545f3263fb9d09a1f Dec 01 15:03:34 crc kubenswrapper[4931]: I1201 15:03:34.421110 4931 generic.go:334] "Generic (PLEG): container finished" podID="739749a0-ea46-49d1-bf55-14f1e1bf7d21" containerID="f1418aabf23bdf22d364a72dbf642e28a3774b034cb4db334bf9be34ad91533f" exitCode=0 Dec 01 15:03:34 crc kubenswrapper[4931]: I1201 15:03:34.421156 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"739749a0-ea46-49d1-bf55-14f1e1bf7d21","Type":"ContainerDied","Data":"f1418aabf23bdf22d364a72dbf642e28a3774b034cb4db334bf9be34ad91533f"} Dec 01 15:03:34 crc kubenswrapper[4931]: I1201 15:03:34.542555 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s2ws6" Dec 01 15:03:34 crc kubenswrapper[4931]: I1201 15:03:34.560825 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:34 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:34 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:34 crc 
kubenswrapper[4931]: healthz check failed Dec 01 15:03:34 crc kubenswrapper[4931]: I1201 15:03:34.560891 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:35 crc kubenswrapper[4931]: I1201 15:03:35.434054 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"52575402-6ccd-41d9-906e-41f21b38e744","Type":"ContainerStarted","Data":"ec188560ae1f5786dc221660b8ec21ebdf4985f8e80577f545f3263fb9d09a1f"} Dec 01 15:03:35 crc kubenswrapper[4931]: I1201 15:03:35.560145 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:35 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:35 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:35 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:35 crc kubenswrapper[4931]: I1201 15:03:35.560281 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.041767 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.062671 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.088717 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e105961-27de-4865-bd7b-44dd04d12034-metrics-certs\") pod \"network-metrics-daemon-78dk9\" (UID: \"2e105961-27de-4865-bd7b-44dd04d12034\") " pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.164052 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kube-api-access\") pod \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\" (UID: \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\") " Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.164181 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kubelet-dir\") pod \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\" (UID: \"739749a0-ea46-49d1-bf55-14f1e1bf7d21\") " Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.164556 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "739749a0-ea46-49d1-bf55-14f1e1bf7d21" (UID: "739749a0-ea46-49d1-bf55-14f1e1bf7d21"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.168138 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "739749a0-ea46-49d1-bf55-14f1e1bf7d21" (UID: "739749a0-ea46-49d1-bf55-14f1e1bf7d21"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.178428 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78dk9" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.267633 4931 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.267670 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/739749a0-ea46-49d1-bf55-14f1e1bf7d21-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.447625 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"52575402-6ccd-41d9-906e-41f21b38e744","Type":"ContainerStarted","Data":"02b75e13153504d247e6cdff7a9323ceac801ed745d4e6557d4800a5572133b8"} Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.451177 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"739749a0-ea46-49d1-bf55-14f1e1bf7d21","Type":"ContainerDied","Data":"ce33d963c914aca48c6946fa0be94f92ff68c18fa2e8ade61b016a7719a1e5ea"} Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.451223 4931 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="ce33d963c914aca48c6946fa0be94f92ff68c18fa2e8ade61b016a7719a1e5ea" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.451281 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.558562 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:36 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:36 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:36 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:36 crc kubenswrapper[4931]: I1201 15:03:36.558887 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:37 crc kubenswrapper[4931]: I1201 15:03:37.479301 4931 generic.go:334] "Generic (PLEG): container finished" podID="52575402-6ccd-41d9-906e-41f21b38e744" containerID="02b75e13153504d247e6cdff7a9323ceac801ed745d4e6557d4800a5572133b8" exitCode=0 Dec 01 15:03:37 crc kubenswrapper[4931]: I1201 15:03:37.479368 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"52575402-6ccd-41d9-906e-41f21b38e744","Type":"ContainerDied","Data":"02b75e13153504d247e6cdff7a9323ceac801ed745d4e6557d4800a5572133b8"} Dec 01 15:03:37 crc kubenswrapper[4931]: I1201 15:03:37.556747 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Dec 01 15:03:37 crc kubenswrapper[4931]: [-]has-synced failed: reason withheld Dec 01 15:03:37 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:37 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:37 crc kubenswrapper[4931]: I1201 15:03:37.556838 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:38 crc kubenswrapper[4931]: I1201 15:03:38.557775 4931 patch_prober.go:28] interesting pod/router-default-5444994796-ckz9k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 15:03:38 crc kubenswrapper[4931]: [+]has-synced ok Dec 01 15:03:38 crc kubenswrapper[4931]: [+]process-running ok Dec 01 15:03:38 crc kubenswrapper[4931]: healthz check failed Dec 01 15:03:38 crc kubenswrapper[4931]: I1201 15:03:38.558201 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-ckz9k" podUID="6d46b073-f023-4090-a6ec-4916356b1e4d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 15:03:39 crc kubenswrapper[4931]: I1201 15:03:39.559030 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:39 crc kubenswrapper[4931]: I1201 15:03:39.563350 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-ckz9k" Dec 01 15:03:41 crc kubenswrapper[4931]: I1201 15:03:41.363614 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zz6cp" Dec 01 15:03:41 crc kubenswrapper[4931]: I1201 15:03:41.436596 4931 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:41 crc kubenswrapper[4931]: I1201 15:03:41.444036 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.206643 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.305012 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52575402-6ccd-41d9-906e-41f21b38e744-kube-api-access\") pod \"52575402-6ccd-41d9-906e-41f21b38e744\" (UID: \"52575402-6ccd-41d9-906e-41f21b38e744\") " Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.305123 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52575402-6ccd-41d9-906e-41f21b38e744-kubelet-dir\") pod \"52575402-6ccd-41d9-906e-41f21b38e744\" (UID: \"52575402-6ccd-41d9-906e-41f21b38e744\") " Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.305247 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52575402-6ccd-41d9-906e-41f21b38e744-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "52575402-6ccd-41d9-906e-41f21b38e744" (UID: "52575402-6ccd-41d9-906e-41f21b38e744"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.305572 4931 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52575402-6ccd-41d9-906e-41f21b38e744-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.321189 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52575402-6ccd-41d9-906e-41f21b38e744-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "52575402-6ccd-41d9-906e-41f21b38e744" (UID: "52575402-6ccd-41d9-906e-41f21b38e744"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.407011 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/52575402-6ccd-41d9-906e-41f21b38e744-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.516886 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"52575402-6ccd-41d9-906e-41f21b38e744","Type":"ContainerDied","Data":"ec188560ae1f5786dc221660b8ec21ebdf4985f8e80577f545f3263fb9d09a1f"} Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.516939 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec188560ae1f5786dc221660b8ec21ebdf4985f8e80577f545f3263fb9d09a1f" Dec 01 15:03:43 crc kubenswrapper[4931]: I1201 15:03:43.516989 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 15:03:49 crc kubenswrapper[4931]: I1201 15:03:49.872122 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:03:49 crc kubenswrapper[4931]: I1201 15:03:49.873039 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:03:49 crc kubenswrapper[4931]: I1201 15:03:49.969734 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" Dec 01 15:03:50 crc kubenswrapper[4931]: I1201 15:03:50.295251 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 15:04:01 crc kubenswrapper[4931]: E1201 15:04:01.152341 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 15:04:01 crc kubenswrapper[4931]: E1201 15:04:01.153080 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lw6cj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jwhzl_openshift-marketplace(0ca715d0-92c9-403d-a09f-e86fbb5c585b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 15:04:01 crc kubenswrapper[4931]: E1201 15:04:01.154350 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jwhzl" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" Dec 01 15:04:01 crc 
kubenswrapper[4931]: E1201 15:04:01.171867 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 01 15:04:01 crc kubenswrapper[4931]: E1201 15:04:01.171996 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-27rkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sfgd5_openshift-marketplace(3667d33f-0665-4c2a-bbed-a160c3d48ed9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 01 15:04:01 crc kubenswrapper[4931]: E1201 15:04:01.173226 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sfgd5" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9"
Dec 01 15:04:01 crc kubenswrapper[4931]: I1201 15:04:01.933061 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jhppc"
Dec 01 15:04:01 crc kubenswrapper[4931]: E1201 15:04:01.942798 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sfgd5" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9"
Dec 01 15:04:01 crc kubenswrapper[4931]: E1201 15:04:01.943103 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jwhzl" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b"
Dec 01 15:04:02 crc kubenswrapper[4931]: E1201 15:04:02.080257 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 01 15:04:02 crc kubenswrapper[4931]: E1201 15:04:02.080606 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nh22s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9vldw_openshift-marketplace(6d290cb6-63ef-49e0-8772-c74447b6fcff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 01 15:04:02 crc kubenswrapper[4931]: E1201 15:04:02.081865 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9vldw" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.587135 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9vldw" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.659218 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.659658 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wmc5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-p62ld_openshift-marketplace(eeec19ab-af88-4a66-8414-a15046f37aaf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.661148 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-p62ld" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.685783 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.686001 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6fx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-j4vb5_openshift-marketplace(2e430b98-f909-4329-bfe8-2cc67aba88fb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.687268 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-j4vb5" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.696635 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.696814 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xbpwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pmdbm_openshift-marketplace(9a36eb78-7328-4941-be7b-33191ddfb5b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 01 15:04:04 crc kubenswrapper[4931]: E1201 15:04:04.698033 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pmdbm" podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5"
Dec 01 15:04:04 crc kubenswrapper[4931]: I1201 15:04:04.974935 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-78dk9"]
Dec 01 15:04:08 crc kubenswrapper[4931]: E1201 15:04:08.100485 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-p62ld" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf"
Dec 01 15:04:08 crc kubenswrapper[4931]: E1201 15:04:08.100519 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-j4vb5" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb"
Dec 01 15:04:08 crc kubenswrapper[4931]: E1201 15:04:08.100823 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pmdbm" podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5"
Dec 01 15:04:08 crc kubenswrapper[4931]: W1201 15:04:08.102822 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e105961_27de_4865_bd7b_44dd04d12034.slice/crio-72746989e2c2b77dade5e1c34875c425aaf6389fed1749511972d1b714de2e38 WatchSource:0}: Error finding container 72746989e2c2b77dade5e1c34875c425aaf6389fed1749511972d1b714de2e38: Status 404 returned error can't find the container with id 72746989e2c2b77dade5e1c34875c425aaf6389fed1749511972d1b714de2e38
Dec 01 15:04:08 crc kubenswrapper[4931]: I1201 15:04:08.658366 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47bmf" event={"ID":"8409825a-f886-4896-9c0c-919d12b3761c","Type":"ContainerStarted","Data":"433449a5079cd5eabe25db8c690cccdac271038428004b87d57deaa093955913"}
Dec 01 15:04:08 crc kubenswrapper[4931]: I1201 15:04:08.670426 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47ntf" event={"ID":"4d5070d6-9ed9-4a75-95a0-c1d57b468c58","Type":"ContainerStarted","Data":"b331704129c2c5dd9f4b872c4b88eb288cb01d7ae5b0d41a968de297db74f3b4"}
Dec 01 15:04:08 crc kubenswrapper[4931]: I1201 15:04:08.673526 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-78dk9" event={"ID":"2e105961-27de-4865-bd7b-44dd04d12034","Type":"ContainerStarted","Data":"4b5f1872977f84a9b59fcb03157b65a9ee02ed3566cfcf3e00a442400142e9ba"}
Dec 01 15:04:08 crc kubenswrapper[4931]: I1201 15:04:08.673587 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-78dk9" event={"ID":"2e105961-27de-4865-bd7b-44dd04d12034","Type":"ContainerStarted","Data":"d53037f68d7ebd8d27f4771b402445b0499018dad5e8995e32a0834df0b57a60"}
Dec 01 15:04:08 crc kubenswrapper[4931]: I1201 15:04:08.673616 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-78dk9" event={"ID":"2e105961-27de-4865-bd7b-44dd04d12034","Type":"ContainerStarted","Data":"72746989e2c2b77dade5e1c34875c425aaf6389fed1749511972d1b714de2e38"}
Dec 01 15:04:08 crc kubenswrapper[4931]: I1201 15:04:08.709521 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-78dk9" podStartSLOduration=174.709494645 podStartE2EDuration="2m54.709494645s" podCreationTimestamp="2025-12-01 15:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:04:08.708880137 +0000 UTC m=+195.134753814" watchObservedRunningTime="2025-12-01 15:04:08.709494645 +0000 UTC m=+195.135368312"
Dec 01 15:04:09 crc kubenswrapper[4931]: I1201 15:04:09.681483 4931 generic.go:334] "Generic (PLEG): container finished" podID="8409825a-f886-4896-9c0c-919d12b3761c" containerID="433449a5079cd5eabe25db8c690cccdac271038428004b87d57deaa093955913" exitCode=0
Dec 01 15:04:09 crc kubenswrapper[4931]: I1201 15:04:09.681677 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47bmf" event={"ID":"8409825a-f886-4896-9c0c-919d12b3761c","Type":"ContainerDied","Data":"433449a5079cd5eabe25db8c690cccdac271038428004b87d57deaa093955913"}
Dec 01 15:04:09 crc kubenswrapper[4931]: I1201 15:04:09.686053 4931 generic.go:334] "Generic (PLEG): container finished" podID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerID="b331704129c2c5dd9f4b872c4b88eb288cb01d7ae5b0d41a968de297db74f3b4" exitCode=0
Dec 01 15:04:09 crc kubenswrapper[4931]: I1201 15:04:09.686514 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47ntf" event={"ID":"4d5070d6-9ed9-4a75-95a0-c1d57b468c58","Type":"ContainerDied","Data":"b331704129c2c5dd9f4b872c4b88eb288cb01d7ae5b0d41a968de297db74f3b4"}
Dec 01 15:04:10 crc kubenswrapper[4931]: I1201 15:04:10.696745 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47bmf" event={"ID":"8409825a-f886-4896-9c0c-919d12b3761c","Type":"ContainerStarted","Data":"1465d790ef7c2c06790346fd3cae6f9da1b256cd63beff2695d5b49dd359b18f"}
Dec 01 15:04:10 crc kubenswrapper[4931]: I1201 15:04:10.722506 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-47bmf" podStartSLOduration=2.825465372 podStartE2EDuration="39.722486573s" podCreationTimestamp="2025-12-01 15:03:31 +0000 UTC" firstStartedPulling="2025-12-01 15:03:33.347219564 +0000 UTC m=+159.773093221" lastFinishedPulling="2025-12-01 15:04:10.244240715 +0000 UTC m=+196.670114422" observedRunningTime="2025-12-01 15:04:10.716704937 +0000 UTC m=+197.142578614" watchObservedRunningTime="2025-12-01 15:04:10.722486573 +0000 UTC m=+197.148360240"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.160427 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 01 15:04:11 crc kubenswrapper[4931]: E1201 15:04:11.161019 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52575402-6ccd-41d9-906e-41f21b38e744" containerName="pruner"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.161038 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="52575402-6ccd-41d9-906e-41f21b38e744" containerName="pruner"
Dec 01 15:04:11 crc kubenswrapper[4931]: E1201 15:04:11.161069 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739749a0-ea46-49d1-bf55-14f1e1bf7d21" containerName="pruner"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.161081 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="739749a0-ea46-49d1-bf55-14f1e1bf7d21" containerName="pruner"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.161219 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="739749a0-ea46-49d1-bf55-14f1e1bf7d21" containerName="pruner"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.161244 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="52575402-6ccd-41d9-906e-41f21b38e744" containerName="pruner"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.161733 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.166664 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.169685 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.174685 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.314003 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14cff8a0-157e-40db-8786-ee000a821787-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"14cff8a0-157e-40db-8786-ee000a821787\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.314143 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14cff8a0-157e-40db-8786-ee000a821787-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"14cff8a0-157e-40db-8786-ee000a821787\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.416028 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14cff8a0-157e-40db-8786-ee000a821787-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"14cff8a0-157e-40db-8786-ee000a821787\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.416533 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14cff8a0-157e-40db-8786-ee000a821787-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"14cff8a0-157e-40db-8786-ee000a821787\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.416175 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14cff8a0-157e-40db-8786-ee000a821787-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"14cff8a0-157e-40db-8786-ee000a821787\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.450999 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14cff8a0-157e-40db-8786-ee000a821787-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"14cff8a0-157e-40db-8786-ee000a821787\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.477892 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.700983 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-47bmf"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.701258 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-47bmf"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.706684 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47ntf" event={"ID":"4d5070d6-9ed9-4a75-95a0-c1d57b468c58","Type":"ContainerStarted","Data":"2e2dc92d88070c1b0fbdbfca6daeaf5e567b162f8c10c6c6332484d3eee8f168"}
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.732745 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-47ntf" podStartSLOduration=3.347836145 podStartE2EDuration="40.732726959s" podCreationTimestamp="2025-12-01 15:03:31 +0000 UTC" firstStartedPulling="2025-12-01 15:03:33.377644686 +0000 UTC m=+159.803518353" lastFinishedPulling="2025-12-01 15:04:10.7625355 +0000 UTC m=+197.188409167" observedRunningTime="2025-12-01 15:04:11.732296897 +0000 UTC m=+198.158170564" watchObservedRunningTime="2025-12-01 15:04:11.732726959 +0000 UTC m=+198.158600626"
Dec 01 15:04:11 crc kubenswrapper[4931]: I1201 15:04:11.913455 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 01 15:04:11 crc kubenswrapper[4931]: W1201 15:04:11.923278 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod14cff8a0_157e_40db_8786_ee000a821787.slice/crio-d97dfc83e80402e645b5d83bb8a246fda957fc49b7c91ecc0785d0f36c2a3f02 WatchSource:0}: Error finding container d97dfc83e80402e645b5d83bb8a246fda957fc49b7c91ecc0785d0f36c2a3f02: Status 404 returned error can't find the container with id d97dfc83e80402e645b5d83bb8a246fda957fc49b7c91ecc0785d0f36c2a3f02
Dec 01 15:04:12 crc kubenswrapper[4931]: I1201 15:04:12.095271 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-47ntf"
Dec 01 15:04:12 crc kubenswrapper[4931]: I1201 15:04:12.095368 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-47ntf"
Dec 01 15:04:12 crc kubenswrapper[4931]: I1201 15:04:12.714110 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"14cff8a0-157e-40db-8786-ee000a821787","Type":"ContainerStarted","Data":"2dc3fb3225d796e5f5d29764ec105ea24e46b57adc12a7377f7e0d697563a297"}
Dec 01 15:04:12 crc kubenswrapper[4931]: I1201 15:04:12.714345 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"14cff8a0-157e-40db-8786-ee000a821787","Type":"ContainerStarted","Data":"d97dfc83e80402e645b5d83bb8a246fda957fc49b7c91ecc0785d0f36c2a3f02"}
Dec 01 15:04:12 crc kubenswrapper[4931]: I1201 15:04:12.731277 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.7312504199999998 podStartE2EDuration="1.73125042s" podCreationTimestamp="2025-12-01 15:04:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:04:12.727190794 +0000 UTC m=+199.153064481" watchObservedRunningTime="2025-12-01 15:04:12.73125042 +0000 UTC m=+199.157124107"
Dec 01 15:04:12 crc kubenswrapper[4931]: I1201 15:04:12.805588 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-47bmf" podUID="8409825a-f886-4896-9c0c-919d12b3761c" containerName="registry-server" probeResult="failure" output=<
Dec 01 15:04:12 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s
Dec 01 15:04:12 crc kubenswrapper[4931]: >
Dec 01 15:04:13 crc kubenswrapper[4931]: I1201 15:04:13.130849 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-47ntf" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerName="registry-server" probeResult="failure" output=<
Dec 01 15:04:13 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s
Dec 01 15:04:13 crc kubenswrapper[4931]: >
Dec 01 15:04:13 crc kubenswrapper[4931]: I1201 15:04:13.721559 4931 generic.go:334] "Generic (PLEG): container finished" podID="14cff8a0-157e-40db-8786-ee000a821787" containerID="2dc3fb3225d796e5f5d29764ec105ea24e46b57adc12a7377f7e0d697563a297" exitCode=0
Dec 01 15:04:13 crc kubenswrapper[4931]: I1201 15:04:13.721678 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"14cff8a0-157e-40db-8786-ee000a821787","Type":"ContainerDied","Data":"2dc3fb3225d796e5f5d29764ec105ea24e46b57adc12a7377f7e0d697563a297"}
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.057219 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.084797 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14cff8a0-157e-40db-8786-ee000a821787-kubelet-dir\") pod \"14cff8a0-157e-40db-8786-ee000a821787\" (UID: \"14cff8a0-157e-40db-8786-ee000a821787\") "
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.084947 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14cff8a0-157e-40db-8786-ee000a821787-kube-api-access\") pod \"14cff8a0-157e-40db-8786-ee000a821787\" (UID: \"14cff8a0-157e-40db-8786-ee000a821787\") "
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.084957 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14cff8a0-157e-40db-8786-ee000a821787-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "14cff8a0-157e-40db-8786-ee000a821787" (UID: "14cff8a0-157e-40db-8786-ee000a821787"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.085106 4931 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14cff8a0-157e-40db-8786-ee000a821787-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.097773 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14cff8a0-157e-40db-8786-ee000a821787-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "14cff8a0-157e-40db-8786-ee000a821787" (UID: "14cff8a0-157e-40db-8786-ee000a821787"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.186071 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14cff8a0-157e-40db-8786-ee000a821787-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.738538 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"14cff8a0-157e-40db-8786-ee000a821787","Type":"ContainerDied","Data":"d97dfc83e80402e645b5d83bb8a246fda957fc49b7c91ecc0785d0f36c2a3f02"}
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.738587 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d97dfc83e80402e645b5d83bb8a246fda957fc49b7c91ecc0785d0f36c2a3f02"
Dec 01 15:04:15 crc kubenswrapper[4931]: I1201 15:04:15.738656 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.361311 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 01 15:04:16 crc kubenswrapper[4931]: E1201 15:04:16.362026 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cff8a0-157e-40db-8786-ee000a821787" containerName="pruner"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.362042 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cff8a0-157e-40db-8786-ee000a821787" containerName="pruner"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.362172 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cff8a0-157e-40db-8786-ee000a821787" containerName="pruner"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.362646 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.365173 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.365446 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.377621 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.503564 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eee31af-f548-4846-ae56-affc50023793-kube-api-access\") pod \"installer-9-crc\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.503704 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.503741 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-var-lock\") pod \"installer-9-crc\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.604573 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.604648 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-var-lock\") pod \"installer-9-crc\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.604768 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.604862 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-var-lock\") pod \"installer-9-crc\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.604790 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eee31af-f548-4846-ae56-affc50023793-kube-api-access\") pod \"installer-9-crc\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.630874 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eee31af-f548-4846-ae56-affc50023793-kube-api-access\") pod \"installer-9-crc\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.691431 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.747205 4931 generic.go:334] "Generic (PLEG): container finished" podID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerID="9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d" exitCode=0
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.747704 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwhzl" event={"ID":"0ca715d0-92c9-403d-a09f-e86fbb5c585b","Type":"ContainerDied","Data":"9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d"}
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.755240 4931 generic.go:334] "Generic (PLEG): container finished" podID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerID="8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0" exitCode=0
Dec 01 15:04:16 crc kubenswrapper[4931]: I1201 15:04:16.755307 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfgd5" event={"ID":"3667d33f-0665-4c2a-bbed-a160c3d48ed9","Type":"ContainerDied","Data":"8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0"}
Dec 01 15:04:17 crc kubenswrapper[4931]: I1201 15:04:17.396468 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 01 15:04:17 crc kubenswrapper[4931]: W1201 15:04:17.400166 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3eee31af_f548_4846_ae56_affc50023793.slice/crio-cc1f8c18d0238f8d2af931d0bec408ca9c508b9b8cd4a96b206faafd46bac800 WatchSource:0}: Error finding container cc1f8c18d0238f8d2af931d0bec408ca9c508b9b8cd4a96b206faafd46bac800: Status 404 returned error can't find the container with id 
cc1f8c18d0238f8d2af931d0bec408ca9c508b9b8cd4a96b206faafd46bac800 Dec 01 15:04:17 crc kubenswrapper[4931]: I1201 15:04:17.761667 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3eee31af-f548-4846-ae56-affc50023793","Type":"ContainerStarted","Data":"ff4bbc01e3e50e66ae67e2c03aeb442dfd8a115ae5642a4b13e54e80a4015f28"} Dec 01 15:04:17 crc kubenswrapper[4931]: I1201 15:04:17.762052 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3eee31af-f548-4846-ae56-affc50023793","Type":"ContainerStarted","Data":"cc1f8c18d0238f8d2af931d0bec408ca9c508b9b8cd4a96b206faafd46bac800"} Dec 01 15:04:17 crc kubenswrapper[4931]: I1201 15:04:17.765232 4931 generic.go:334] "Generic (PLEG): container finished" podID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerID="514501bdcfd5b8e8ec60f6c82e231d32a0c42738849e2284e7e45cc580d2fcc3" exitCode=0 Dec 01 15:04:17 crc kubenswrapper[4931]: I1201 15:04:17.765263 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vldw" event={"ID":"6d290cb6-63ef-49e0-8772-c74447b6fcff","Type":"ContainerDied","Data":"514501bdcfd5b8e8ec60f6c82e231d32a0c42738849e2284e7e45cc580d2fcc3"} Dec 01 15:04:17 crc kubenswrapper[4931]: I1201 15:04:17.779039 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.779012894 podStartE2EDuration="1.779012894s" podCreationTimestamp="2025-12-01 15:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:04:17.777022817 +0000 UTC m=+204.202896484" watchObservedRunningTime="2025-12-01 15:04:17.779012894 +0000 UTC m=+204.204886561" Dec 01 15:04:18 crc kubenswrapper[4931]: I1201 15:04:18.774065 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-sfgd5" event={"ID":"3667d33f-0665-4c2a-bbed-a160c3d48ed9","Type":"ContainerStarted","Data":"c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b"} Dec 01 15:04:18 crc kubenswrapper[4931]: I1201 15:04:18.783841 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vldw" event={"ID":"6d290cb6-63ef-49e0-8772-c74447b6fcff","Type":"ContainerStarted","Data":"94895ea75731a7a0bce9173168e2cdf8f66df1bf6c07719ad4f13132cba35795"} Dec 01 15:04:18 crc kubenswrapper[4931]: I1201 15:04:18.786437 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwhzl" event={"ID":"0ca715d0-92c9-403d-a09f-e86fbb5c585b","Type":"ContainerStarted","Data":"bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f"} Dec 01 15:04:18 crc kubenswrapper[4931]: I1201 15:04:18.802821 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sfgd5" podStartSLOduration=2.292023404 podStartE2EDuration="50.802793339s" podCreationTimestamp="2025-12-01 15:03:28 +0000 UTC" firstStartedPulling="2025-12-01 15:03:30.128804425 +0000 UTC m=+156.554678092" lastFinishedPulling="2025-12-01 15:04:18.63957437 +0000 UTC m=+205.065448027" observedRunningTime="2025-12-01 15:04:18.801197433 +0000 UTC m=+205.227071130" watchObservedRunningTime="2025-12-01 15:04:18.802793339 +0000 UTC m=+205.228667046" Dec 01 15:04:18 crc kubenswrapper[4931]: I1201 15:04:18.854249 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9vldw" podStartSLOduration=1.813408551 podStartE2EDuration="48.854214063s" podCreationTimestamp="2025-12-01 15:03:30 +0000 UTC" firstStartedPulling="2025-12-01 15:03:31.252290287 +0000 UTC m=+157.678163954" lastFinishedPulling="2025-12-01 15:04:18.293095759 +0000 UTC m=+204.718969466" observedRunningTime="2025-12-01 
15:04:18.830957126 +0000 UTC m=+205.256830803" watchObservedRunningTime="2025-12-01 15:04:18.854214063 +0000 UTC m=+205.280087770" Dec 01 15:04:18 crc kubenswrapper[4931]: I1201 15:04:18.860130 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jwhzl" podStartSLOduration=3.36402987 podStartE2EDuration="50.860094051s" podCreationTimestamp="2025-12-01 15:03:28 +0000 UTC" firstStartedPulling="2025-12-01 15:03:30.108623036 +0000 UTC m=+156.534496703" lastFinishedPulling="2025-12-01 15:04:17.604687227 +0000 UTC m=+204.030560884" observedRunningTime="2025-12-01 15:04:18.856823567 +0000 UTC m=+205.282697234" watchObservedRunningTime="2025-12-01 15:04:18.860094051 +0000 UTC m=+205.285967758" Dec 01 15:04:19 crc kubenswrapper[4931]: I1201 15:04:19.238355 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:04:19 crc kubenswrapper[4931]: I1201 15:04:19.238518 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:04:19 crc kubenswrapper[4931]: I1201 15:04:19.871830 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:04:19 crc kubenswrapper[4931]: I1201 15:04:19.871930 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:04:19 crc kubenswrapper[4931]: I1201 15:04:19.872003 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:04:19 crc kubenswrapper[4931]: I1201 15:04:19.872876 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:04:19 crc kubenswrapper[4931]: I1201 15:04:19.873043 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6" gracePeriod=600 Dec 01 15:04:20 crc kubenswrapper[4931]: I1201 15:04:20.281877 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jwhzl" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerName="registry-server" probeResult="failure" output=< Dec 01 15:04:20 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Dec 01 15:04:20 crc kubenswrapper[4931]: > Dec 01 15:04:20 crc kubenswrapper[4931]: I1201 15:04:20.663046 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:04:20 crc kubenswrapper[4931]: I1201 15:04:20.663402 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:04:21 crc kubenswrapper[4931]: I1201 15:04:21.714981 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-9vldw" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerName="registry-server" probeResult="failure" 
output=< Dec 01 15:04:21 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Dec 01 15:04:21 crc kubenswrapper[4931]: > Dec 01 15:04:22 crc kubenswrapper[4931]: I1201 15:04:22.575147 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:04:22 crc kubenswrapper[4931]: I1201 15:04:22.578315 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:04:22 crc kubenswrapper[4931]: I1201 15:04:22.623764 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2v4w"] Dec 01 15:04:22 crc kubenswrapper[4931]: I1201 15:04:22.625087 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:04:22 crc kubenswrapper[4931]: I1201 15:04:22.643054 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:04:22 crc kubenswrapper[4931]: I1201 15:04:22.811367 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6" exitCode=0 Dec 01 15:04:22 crc kubenswrapper[4931]: I1201 15:04:22.811435 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6"} Dec 01 15:04:23 crc kubenswrapper[4931]: I1201 15:04:23.820868 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"f7b3bdd82a8534c77d3ea7e5ad5dd0c29ea4ba55c0d43ad658e7866a1c0c4265"} Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.532746 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47ntf"] Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.533694 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-47ntf" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerName="registry-server" containerID="cri-o://2e2dc92d88070c1b0fbdbfca6daeaf5e567b162f8c10c6c6332484d3eee8f168" gracePeriod=2 Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.834572 4931 generic.go:334] "Generic (PLEG): container finished" podID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerID="09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102" exitCode=0 Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.834676 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4vb5" event={"ID":"2e430b98-f909-4329-bfe8-2cc67aba88fb","Type":"ContainerDied","Data":"09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102"} Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.839109 4931 generic.go:334] "Generic (PLEG): container finished" podID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerID="57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0" exitCode=0 Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.839194 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p62ld" event={"ID":"eeec19ab-af88-4a66-8414-a15046f37aaf","Type":"ContainerDied","Data":"57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0"} Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.842029 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerID="4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075" exitCode=0 Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.842086 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmdbm" event={"ID":"9a36eb78-7328-4941-be7b-33191ddfb5b5","Type":"ContainerDied","Data":"4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075"} Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.845545 4931 generic.go:334] "Generic (PLEG): container finished" podID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerID="2e2dc92d88070c1b0fbdbfca6daeaf5e567b162f8c10c6c6332484d3eee8f168" exitCode=0 Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.845591 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47ntf" event={"ID":"4d5070d6-9ed9-4a75-95a0-c1d57b468c58","Type":"ContainerDied","Data":"2e2dc92d88070c1b0fbdbfca6daeaf5e567b162f8c10c6c6332484d3eee8f168"} Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.908369 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.932661 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt6rg\" (UniqueName: \"kubernetes.io/projected/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-kube-api-access-mt6rg\") pod \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.933018 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-utilities\") pod \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.933087 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-catalog-content\") pod \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\" (UID: \"4d5070d6-9ed9-4a75-95a0-c1d57b468c58\") " Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.933963 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-utilities" (OuterVolumeSpecName: "utilities") pod "4d5070d6-9ed9-4a75-95a0-c1d57b468c58" (UID: "4d5070d6-9ed9-4a75-95a0-c1d57b468c58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:04:25 crc kubenswrapper[4931]: I1201 15:04:25.978604 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-kube-api-access-mt6rg" (OuterVolumeSpecName: "kube-api-access-mt6rg") pod "4d5070d6-9ed9-4a75-95a0-c1d57b468c58" (UID: "4d5070d6-9ed9-4a75-95a0-c1d57b468c58"). InnerVolumeSpecName "kube-api-access-mt6rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.034649 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt6rg\" (UniqueName: \"kubernetes.io/projected/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-kube-api-access-mt6rg\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.034685 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.068315 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d5070d6-9ed9-4a75-95a0-c1d57b468c58" (UID: "4d5070d6-9ed9-4a75-95a0-c1d57b468c58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.135816 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d5070d6-9ed9-4a75-95a0-c1d57b468c58-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.854497 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47ntf" event={"ID":"4d5070d6-9ed9-4a75-95a0-c1d57b468c58","Type":"ContainerDied","Data":"316c88921699281b4536f907c993d729f17717b63da0719f561a78404672ae71"} Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.854897 4931 scope.go:117] "RemoveContainer" containerID="2e2dc92d88070c1b0fbdbfca6daeaf5e567b162f8c10c6c6332484d3eee8f168" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.854748 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-47ntf" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.861337 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmdbm" event={"ID":"9a36eb78-7328-4941-be7b-33191ddfb5b5","Type":"ContainerStarted","Data":"38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39"} Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.866262 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p62ld" event={"ID":"eeec19ab-af88-4a66-8414-a15046f37aaf","Type":"ContainerStarted","Data":"f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61"} Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.870888 4931 scope.go:117] "RemoveContainer" containerID="b331704129c2c5dd9f4b872c4b88eb288cb01d7ae5b0d41a968de297db74f3b4" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.887676 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pmdbm" podStartSLOduration=2.689590769 podStartE2EDuration="58.887656945s" podCreationTimestamp="2025-12-01 15:03:28 +0000 UTC" firstStartedPulling="2025-12-01 15:03:30.183716699 +0000 UTC m=+156.609590356" lastFinishedPulling="2025-12-01 15:04:26.381782865 +0000 UTC m=+212.807656532" observedRunningTime="2025-12-01 15:04:26.887019337 +0000 UTC m=+213.312893014" watchObservedRunningTime="2025-12-01 15:04:26.887656945 +0000 UTC m=+213.313530612" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.905808 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47ntf"] Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.907824 4931 scope.go:117] "RemoveContainer" containerID="09b50bcd61eb303c9ecf6aabe4fd62f33fcb3158530bb679233498c4f2ae0d6f" Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.910153 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-47ntf"] Dec 01 15:04:26 crc kubenswrapper[4931]: I1201 15:04:26.922181 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p62ld" podStartSLOduration=2.650096297 podStartE2EDuration="58.922162985s" podCreationTimestamp="2025-12-01 15:03:28 +0000 UTC" firstStartedPulling="2025-12-01 15:03:30.228589405 +0000 UTC m=+156.654463072" lastFinishedPulling="2025-12-01 15:04:26.500656093 +0000 UTC m=+212.926529760" observedRunningTime="2025-12-01 15:04:26.920192298 +0000 UTC m=+213.346065965" watchObservedRunningTime="2025-12-01 15:04:26.922162985 +0000 UTC m=+213.348036652" Dec 01 15:04:27 crc kubenswrapper[4931]: I1201 15:04:27.874375 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4vb5" event={"ID":"2e430b98-f909-4329-bfe8-2cc67aba88fb","Type":"ContainerStarted","Data":"e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146"} Dec 01 15:04:27 crc kubenswrapper[4931]: I1201 15:04:27.904806 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j4vb5" podStartSLOduration=3.482953935 podStartE2EDuration="57.90478134s" podCreationTimestamp="2025-12-01 15:03:30 +0000 UTC" firstStartedPulling="2025-12-01 15:03:32.320017571 +0000 UTC m=+158.745891238" lastFinishedPulling="2025-12-01 15:04:26.741844976 +0000 UTC m=+213.167718643" observedRunningTime="2025-12-01 15:04:27.901892527 +0000 UTC m=+214.327766204" watchObservedRunningTime="2025-12-01 15:04:27.90478134 +0000 UTC m=+214.330655017" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.248989 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" path="/var/lib/kubelet/pods/4d5070d6-9ed9-4a75-95a0-c1d57b468c58/volumes" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.469570 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.469633 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.662275 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.662331 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.706939 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.902016 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.902435 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.918535 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:04:28 crc kubenswrapper[4931]: I1201 15:04:28.958731 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:04:29 crc kubenswrapper[4931]: I1201 15:04:29.285811 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:04:29 crc kubenswrapper[4931]: I1201 15:04:29.329736 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 
15:04:29 crc kubenswrapper[4931]: I1201 15:04:29.515748 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-p62ld" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerName="registry-server" probeResult="failure" output=< Dec 01 15:04:29 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Dec 01 15:04:29 crc kubenswrapper[4931]: > Dec 01 15:04:30 crc kubenswrapper[4931]: I1201 15:04:30.722797 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:04:30 crc kubenswrapper[4931]: I1201 15:04:30.767254 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:04:31 crc kubenswrapper[4931]: I1201 15:04:31.065578 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:04:31 crc kubenswrapper[4931]: I1201 15:04:31.065909 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:04:31 crc kubenswrapper[4931]: I1201 15:04:31.107693 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:04:31 crc kubenswrapper[4931]: I1201 15:04:31.935319 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:04:32 crc kubenswrapper[4931]: I1201 15:04:32.531702 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwhzl"] Dec 01 15:04:32 crc kubenswrapper[4931]: I1201 15:04:32.531929 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jwhzl" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerName="registry-server" 
containerID="cri-o://bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f" gracePeriod=2 Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.422602 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.534863 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw6cj\" (UniqueName: \"kubernetes.io/projected/0ca715d0-92c9-403d-a09f-e86fbb5c585b-kube-api-access-lw6cj\") pod \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.534946 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-utilities\") pod \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.535006 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-catalog-content\") pod \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\" (UID: \"0ca715d0-92c9-403d-a09f-e86fbb5c585b\") " Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.537352 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-utilities" (OuterVolumeSpecName: "utilities") pod "0ca715d0-92c9-403d-a09f-e86fbb5c585b" (UID: "0ca715d0-92c9-403d-a09f-e86fbb5c585b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.545270 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca715d0-92c9-403d-a09f-e86fbb5c585b-kube-api-access-lw6cj" (OuterVolumeSpecName: "kube-api-access-lw6cj") pod "0ca715d0-92c9-403d-a09f-e86fbb5c585b" (UID: "0ca715d0-92c9-403d-a09f-e86fbb5c585b"). InnerVolumeSpecName "kube-api-access-lw6cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.583293 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ca715d0-92c9-403d-a09f-e86fbb5c585b" (UID: "0ca715d0-92c9-403d-a09f-e86fbb5c585b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.637169 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw6cj\" (UniqueName: \"kubernetes.io/projected/0ca715d0-92c9-403d-a09f-e86fbb5c585b-kube-api-access-lw6cj\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.637214 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.637231 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca715d0-92c9-403d-a09f-e86fbb5c585b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.912701 4931 generic.go:334] "Generic (PLEG): container finished" podID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" 
containerID="bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f" exitCode=0 Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.912757 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwhzl" event={"ID":"0ca715d0-92c9-403d-a09f-e86fbb5c585b","Type":"ContainerDied","Data":"bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f"} Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.912796 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jwhzl" event={"ID":"0ca715d0-92c9-403d-a09f-e86fbb5c585b","Type":"ContainerDied","Data":"0f324665963a43451fa5fc9ea7a01b87bddd1376ad1ec4dd45e96f062c948b0d"} Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.912804 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jwhzl" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.912816 4931 scope.go:117] "RemoveContainer" containerID="bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.935548 4931 scope.go:117] "RemoveContainer" containerID="9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d" Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.957956 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jwhzl"] Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.963560 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jwhzl"] Dec 01 15:04:33 crc kubenswrapper[4931]: I1201 15:04:33.988613 4931 scope.go:117] "RemoveContainer" containerID="2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2" Dec 01 15:04:34 crc kubenswrapper[4931]: I1201 15:04:34.004709 4931 scope.go:117] "RemoveContainer" containerID="bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f" Dec 01 
15:04:34 crc kubenswrapper[4931]: E1201 15:04:34.005288 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f\": container with ID starting with bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f not found: ID does not exist" containerID="bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f" Dec 01 15:04:34 crc kubenswrapper[4931]: I1201 15:04:34.005343 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f"} err="failed to get container status \"bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f\": rpc error: code = NotFound desc = could not find container \"bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f\": container with ID starting with bd2acdee85c1a5e49a117abe7ab977dd91fe3c82208f2afa953d422e497eaf0f not found: ID does not exist" Dec 01 15:04:34 crc kubenswrapper[4931]: I1201 15:04:34.005407 4931 scope.go:117] "RemoveContainer" containerID="9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d" Dec 01 15:04:34 crc kubenswrapper[4931]: E1201 15:04:34.006015 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d\": container with ID starting with 9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d not found: ID does not exist" containerID="9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d" Dec 01 15:04:34 crc kubenswrapper[4931]: I1201 15:04:34.006057 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d"} err="failed to get container status 
\"9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d\": rpc error: code = NotFound desc = could not find container \"9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d\": container with ID starting with 9312010f30fbba436199314b9047d486b3904a8e7fc8bf46c0907841b11f067d not found: ID does not exist" Dec 01 15:04:34 crc kubenswrapper[4931]: I1201 15:04:34.006082 4931 scope.go:117] "RemoveContainer" containerID="2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2" Dec 01 15:04:34 crc kubenswrapper[4931]: E1201 15:04:34.006469 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2\": container with ID starting with 2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2 not found: ID does not exist" containerID="2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2" Dec 01 15:04:34 crc kubenswrapper[4931]: I1201 15:04:34.006508 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2"} err="failed to get container status \"2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2\": rpc error: code = NotFound desc = could not find container \"2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2\": container with ID starting with 2b9593e08a3f341dcb25dc1168585f9299e5db4f6ae6110b84a7acce4530fff2 not found: ID does not exist" Dec 01 15:04:34 crc kubenswrapper[4931]: I1201 15:04:34.249365 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" path="/var/lib/kubelet/pods/0ca715d0-92c9-403d-a09f-e86fbb5c585b/volumes" Dec 01 15:04:34 crc kubenswrapper[4931]: I1201 15:04:34.938704 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4vb5"] Dec 01 
15:04:34 crc kubenswrapper[4931]: I1201 15:04:34.939496 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j4vb5" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerName="registry-server" containerID="cri-o://e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146" gracePeriod=2 Dec 01 15:04:35 crc kubenswrapper[4931]: I1201 15:04:35.886346 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.069944 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn6fx\" (UniqueName: \"kubernetes.io/projected/2e430b98-f909-4329-bfe8-2cc67aba88fb-kube-api-access-jn6fx\") pod \"2e430b98-f909-4329-bfe8-2cc67aba88fb\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.070083 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-catalog-content\") pod \"2e430b98-f909-4329-bfe8-2cc67aba88fb\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.070117 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-utilities\") pod \"2e430b98-f909-4329-bfe8-2cc67aba88fb\" (UID: \"2e430b98-f909-4329-bfe8-2cc67aba88fb\") " Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.072249 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-utilities" (OuterVolumeSpecName: "utilities") pod "2e430b98-f909-4329-bfe8-2cc67aba88fb" (UID: "2e430b98-f909-4329-bfe8-2cc67aba88fb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.078155 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e430b98-f909-4329-bfe8-2cc67aba88fb-kube-api-access-jn6fx" (OuterVolumeSpecName: "kube-api-access-jn6fx") pod "2e430b98-f909-4329-bfe8-2cc67aba88fb" (UID: "2e430b98-f909-4329-bfe8-2cc67aba88fb"). InnerVolumeSpecName "kube-api-access-jn6fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.088676 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e430b98-f909-4329-bfe8-2cc67aba88fb" (UID: "2e430b98-f909-4329-bfe8-2cc67aba88fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.171902 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn6fx\" (UniqueName: \"kubernetes.io/projected/2e430b98-f909-4329-bfe8-2cc67aba88fb-kube-api-access-jn6fx\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.171957 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.171981 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e430b98-f909-4329-bfe8-2cc67aba88fb-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.435122 4931 generic.go:334] "Generic (PLEG): container finished" podID="2e430b98-f909-4329-bfe8-2cc67aba88fb" 
containerID="e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146" exitCode=0 Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.435181 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4vb5" event={"ID":"2e430b98-f909-4329-bfe8-2cc67aba88fb","Type":"ContainerDied","Data":"e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146"} Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.435263 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4vb5" event={"ID":"2e430b98-f909-4329-bfe8-2cc67aba88fb","Type":"ContainerDied","Data":"41b352de93f44d3121655c8ca95cbd2faada189e5337bfbb3f6eedfebb0cb228"} Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.435278 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4vb5" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.435286 4931 scope.go:117] "RemoveContainer" containerID="e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.464532 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4vb5"] Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.465291 4931 scope.go:117] "RemoveContainer" containerID="09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.468113 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4vb5"] Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.482160 4931 scope.go:117] "RemoveContainer" containerID="0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.522226 4931 scope.go:117] "RemoveContainer" containerID="e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146" Dec 01 
15:04:36 crc kubenswrapper[4931]: E1201 15:04:36.522686 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146\": container with ID starting with e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146 not found: ID does not exist" containerID="e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.522735 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146"} err="failed to get container status \"e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146\": rpc error: code = NotFound desc = could not find container \"e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146\": container with ID starting with e302b71acdac75ea2f0ba2e7446bf37bac5c5e39667486f8baff4a1da99da146 not found: ID does not exist" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.522761 4931 scope.go:117] "RemoveContainer" containerID="09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102" Dec 01 15:04:36 crc kubenswrapper[4931]: E1201 15:04:36.523063 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102\": container with ID starting with 09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102 not found: ID does not exist" containerID="09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.523079 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102"} err="failed to get container status 
\"09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102\": rpc error: code = NotFound desc = could not find container \"09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102\": container with ID starting with 09d7307ed2591dc8e1a76e682111c4834d1bc3aa9e96f53e083a5c6361be1102 not found: ID does not exist" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.523091 4931 scope.go:117] "RemoveContainer" containerID="0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755" Dec 01 15:04:36 crc kubenswrapper[4931]: E1201 15:04:36.523371 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755\": container with ID starting with 0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755 not found: ID does not exist" containerID="0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755" Dec 01 15:04:36 crc kubenswrapper[4931]: I1201 15:04:36.523443 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755"} err="failed to get container status \"0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755\": rpc error: code = NotFound desc = could not find container \"0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755\": container with ID starting with 0d7de48e8cb3c0af2c7924fa0a4e24e38e698a6780192d9d645b9f040b6a2755 not found: ID does not exist" Dec 01 15:04:38 crc kubenswrapper[4931]: I1201 15:04:38.247930 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb" path="/var/lib/kubelet/pods/2e430b98-f909-4329-bfe8-2cc67aba88fb/volumes" Dec 01 15:04:38 crc kubenswrapper[4931]: I1201 15:04:38.517504 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:04:38 crc kubenswrapper[4931]: I1201 15:04:38.557997 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:04:38 crc kubenswrapper[4931]: I1201 15:04:38.940541 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:04:40 crc kubenswrapper[4931]: I1201 15:04:40.533134 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmdbm"] Dec 01 15:04:40 crc kubenswrapper[4931]: I1201 15:04:40.533402 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pmdbm" podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerName="registry-server" containerID="cri-o://38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39" gracePeriod=2 Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.039644 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.050424 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbpwf\" (UniqueName: \"kubernetes.io/projected/9a36eb78-7328-4941-be7b-33191ddfb5b5-kube-api-access-xbpwf\") pod \"9a36eb78-7328-4941-be7b-33191ddfb5b5\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.050477 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-utilities\") pod \"9a36eb78-7328-4941-be7b-33191ddfb5b5\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.050564 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-catalog-content\") pod \"9a36eb78-7328-4941-be7b-33191ddfb5b5\" (UID: \"9a36eb78-7328-4941-be7b-33191ddfb5b5\") " Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.051487 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-utilities" (OuterVolumeSpecName: "utilities") pod "9a36eb78-7328-4941-be7b-33191ddfb5b5" (UID: "9a36eb78-7328-4941-be7b-33191ddfb5b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.060439 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a36eb78-7328-4941-be7b-33191ddfb5b5-kube-api-access-xbpwf" (OuterVolumeSpecName: "kube-api-access-xbpwf") pod "9a36eb78-7328-4941-be7b-33191ddfb5b5" (UID: "9a36eb78-7328-4941-be7b-33191ddfb5b5"). InnerVolumeSpecName "kube-api-access-xbpwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.108671 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a36eb78-7328-4941-be7b-33191ddfb5b5" (UID: "9a36eb78-7328-4941-be7b-33191ddfb5b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.151270 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbpwf\" (UniqueName: \"kubernetes.io/projected/9a36eb78-7328-4941-be7b-33191ddfb5b5-kube-api-access-xbpwf\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.151295 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.151306 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a36eb78-7328-4941-be7b-33191ddfb5b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.472354 4931 generic.go:334] "Generic (PLEG): container finished" podID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerID="38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39" exitCode=0 Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.472426 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmdbm" event={"ID":"9a36eb78-7328-4941-be7b-33191ddfb5b5","Type":"ContainerDied","Data":"38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39"} Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.472462 4931 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-pmdbm" event={"ID":"9a36eb78-7328-4941-be7b-33191ddfb5b5","Type":"ContainerDied","Data":"e65f2563f0f120b9b9d199df36fa6e31cf4647b1bb0d855edd8aee36324fc36e"} Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.472483 4931 scope.go:117] "RemoveContainer" containerID="38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.472484 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmdbm" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.487988 4931 scope.go:117] "RemoveContainer" containerID="4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.498135 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmdbm"] Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.501051 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pmdbm"] Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.508063 4931 scope.go:117] "RemoveContainer" containerID="dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.529937 4931 scope.go:117] "RemoveContainer" containerID="38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39" Dec 01 15:04:42 crc kubenswrapper[4931]: E1201 15:04:42.530841 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39\": container with ID starting with 38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39 not found: ID does not exist" containerID="38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 
15:04:42.530904 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39"} err="failed to get container status \"38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39\": rpc error: code = NotFound desc = could not find container \"38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39\": container with ID starting with 38369c5516f63519250e5021bc356a5e7aff99bda6677fdc35509d7926c9ef39 not found: ID does not exist" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.530943 4931 scope.go:117] "RemoveContainer" containerID="4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075" Dec 01 15:04:42 crc kubenswrapper[4931]: E1201 15:04:42.532261 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075\": container with ID starting with 4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075 not found: ID does not exist" containerID="4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.532307 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075"} err="failed to get container status \"4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075\": rpc error: code = NotFound desc = could not find container \"4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075\": container with ID starting with 4d844875922456842cad04b99c34660d1b564f7281fb12b4c4f621f1501ee075 not found: ID does not exist" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.532336 4931 scope.go:117] "RemoveContainer" containerID="dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336" Dec 01 15:04:42 crc 
kubenswrapper[4931]: E1201 15:04:42.533270 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336\": container with ID starting with dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336 not found: ID does not exist" containerID="dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336" Dec 01 15:04:42 crc kubenswrapper[4931]: I1201 15:04:42.533332 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336"} err="failed to get container status \"dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336\": rpc error: code = NotFound desc = could not find container \"dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336\": container with ID starting with dff08adb234b01ca789b20528f376b0300b943d29d79ea3fac178d1c1ede9336 not found: ID does not exist" Dec 01 15:04:44 crc kubenswrapper[4931]: I1201 15:04:44.257560 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5" path="/var/lib/kubelet/pods/9a36eb78-7328-4941-be7b-33191ddfb5b5/volumes" Dec 01 15:04:47 crc kubenswrapper[4931]: I1201 15:04:47.666897 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" podUID="855d1329-52b5-4c28-bef3-b18cb2a5e33e" containerName="oauth-openshift" containerID="cri-o://6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566" gracePeriod=15 Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.128604 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.132926 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-policies\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.133932 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234022 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-serving-cert\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234165 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-cliconfig\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234213 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-login\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: 
\"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234263 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-idp-0-file-data\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234340 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-provider-selection\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234449 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-router-certs\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234588 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmkc8\" (UniqueName: \"kubernetes.io/projected/855d1329-52b5-4c28-bef3-b18cb2a5e33e-kube-api-access-fmkc8\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234636 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-ocp-branding-template\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " 
Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234679 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-error\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234737 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-trusted-ca-bundle\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234797 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-service-ca\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234849 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-session\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.234893 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-dir\") pod \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\" (UID: \"855d1329-52b5-4c28-bef3-b18cb2a5e33e\") " Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.235295 4931 reconciler_common.go:293] "Volume detached 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.235358 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.235833 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.236472 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.237585 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.241738 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.242168 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.242940 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.245260 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). 
InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.245990 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855d1329-52b5-4c28-bef3-b18cb2a5e33e-kube-api-access-fmkc8" (OuterVolumeSpecName: "kube-api-access-fmkc8") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "kube-api-access-fmkc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.250759 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.251606 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.251808 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.252371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "855d1329-52b5-4c28-bef3-b18cb2a5e33e" (UID: "855d1329-52b5-4c28-bef3-b18cb2a5e33e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339422 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmkc8\" (UniqueName: \"kubernetes.io/projected/855d1329-52b5-4c28-bef3-b18cb2a5e33e-kube-api-access-fmkc8\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339507 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339534 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339557 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339581 4931 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339603 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339625 4931 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/855d1329-52b5-4c28-bef3-b18cb2a5e33e-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339647 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339667 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339689 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339709 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339730 4931 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.339754 4931 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/855d1329-52b5-4c28-bef3-b18cb2a5e33e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.519656 4931 generic.go:334] "Generic (PLEG): container finished" podID="855d1329-52b5-4c28-bef3-b18cb2a5e33e" containerID="6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566" exitCode=0 Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.519730 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" event={"ID":"855d1329-52b5-4c28-bef3-b18cb2a5e33e","Type":"ContainerDied","Data":"6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566"} Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.519783 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.519823 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-l2v4w" event={"ID":"855d1329-52b5-4c28-bef3-b18cb2a5e33e","Type":"ContainerDied","Data":"b150ffc661e966b0b6b77cd043aed8ff8dac986b16089a16b8948d7839b620ba"} Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.519907 4931 scope.go:117] "RemoveContainer" containerID="6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.567521 4931 scope.go:117] "RemoveContainer" containerID="6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566" Dec 01 15:04:48 crc kubenswrapper[4931]: E1201 15:04:48.568271 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566\": container with ID starting with 6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566 not found: ID does not exist" containerID="6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.568323 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566"} err="failed to get container status \"6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566\": rpc error: code = NotFound desc = could not find container \"6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566\": container with ID starting with 6255575c2003ba37b60c277f22d1e96f61dbf956ef3c650152646b012f549566 not found: ID does not exist" Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.569478 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-l2v4w"] Dec 01 15:04:48 crc kubenswrapper[4931]: I1201 15:04:48.576068 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-l2v4w"] Dec 01 15:04:50 crc kubenswrapper[4931]: I1201 15:04:50.252241 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855d1329-52b5-4c28-bef3-b18cb2a5e33e" path="/var/lib/kubelet/pods/855d1329-52b5-4c28-bef3-b18cb2a5e33e/volumes" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.521287 4931 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.521981 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerName="extract-content" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.521994 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerName="extract-content" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522005 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522011 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522030 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerName="extract-content" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522038 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerName="extract-content" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522047 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522054 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522062 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerName="extract-utilities" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522068 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerName="extract-utilities" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522076 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerName="extract-content" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522082 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerName="extract-content" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522092 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522098 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522106 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerName="extract-content" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522112 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerName="extract-content" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522119 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerName="extract-utilities" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522125 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerName="extract-utilities" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522134 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerName="extract-utilities" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522139 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerName="extract-utilities" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522146 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855d1329-52b5-4c28-bef3-b18cb2a5e33e" containerName="oauth-openshift" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522153 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="855d1329-52b5-4c28-bef3-b18cb2a5e33e" containerName="oauth-openshift" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522162 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerName="extract-utilities" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522168 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerName="extract-utilities" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.522179 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522185 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522289 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9a36eb78-7328-4941-be7b-33191ddfb5b5" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522303 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5070d6-9ed9-4a75-95a0-c1d57b468c58" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522311 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e430b98-f909-4329-bfe8-2cc67aba88fb" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522319 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca715d0-92c9-403d-a09f-e86fbb5c585b" containerName="registry-server" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522326 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="855d1329-52b5-4c28-bef3-b18cb2a5e33e" containerName="oauth-openshift" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522719 4931 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522872 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.522971 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55" gracePeriod=15 Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.523112 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7" gracePeriod=15 Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.523169 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf" gracePeriod=15 Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.523143 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c" gracePeriod=15 Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.523326 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a" gracePeriod=15 Dec 01 15:04:55 crc 
kubenswrapper[4931]: I1201 15:04:55.524316 4931 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.524856 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.524884 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.524905 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.524918 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.524945 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.524963 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.525000 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.525018 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.525045 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 
15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.525062 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.525082 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.525098 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.525531 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.525570 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.525591 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.525626 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.525648 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.585165 4931 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.645240 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.645723 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.645762 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.645855 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.645892 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.645912 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.645968 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.646022 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747503 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747602 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747629 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747683 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747738 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747744 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747802 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747760 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747837 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747936 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747990 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.748043 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.748064 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.748088 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.748138 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.747993 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: I1201 15:04:55.886331 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:55 crc kubenswrapper[4931]: W1201 15:04:55.903777 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-bee8d1bbbb4b91661075ecd5ba15007493a8ddaacfded5fc030096680e1de33b WatchSource:0}: Error finding container bee8d1bbbb4b91661075ecd5ba15007493a8ddaacfded5fc030096680e1de33b: Status 404 returned error can't find the container with id bee8d1bbbb4b91661075ecd5ba15007493a8ddaacfded5fc030096680e1de33b Dec 01 15:04:55 crc kubenswrapper[4931]: E1201 15:04:55.906801 4931 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.2:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d1fb6159a12fc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 15:04:55.906030332 +0000 UTC m=+242.331903999,LastTimestamp:2025-12-01 15:04:55.906030332 +0000 UTC m=+242.331903999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 15:04:56 crc kubenswrapper[4931]: E1201 15:04:56.143767 4931 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.2:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d1fb6159a12fc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 15:04:55.906030332 +0000 UTC m=+242.331903999,LastTimestamp:2025-12-01 15:04:55.906030332 +0000 UTC m=+242.331903999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.587178 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.587909 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7" exitCode=0 Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.587932 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c" exitCode=0 Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.587939 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a" exitCode=0 Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.587946 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf" exitCode=2 Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.589506 4931 generic.go:334] "Generic (PLEG): container finished" podID="3eee31af-f548-4846-ae56-affc50023793" containerID="ff4bbc01e3e50e66ae67e2c03aeb442dfd8a115ae5642a4b13e54e80a4015f28" exitCode=0 Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.589640 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3eee31af-f548-4846-ae56-affc50023793","Type":"ContainerDied","Data":"ff4bbc01e3e50e66ae67e2c03aeb442dfd8a115ae5642a4b13e54e80a4015f28"} Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.590803 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5a8f3a581d83a9a876ad7036013f352e24956b06748fa4d3a791b936ee46b581"} Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.590845 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bee8d1bbbb4b91661075ecd5ba15007493a8ddaacfded5fc030096680e1de33b"} Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.591572 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:56 crc kubenswrapper[4931]: 
E1201 15:04:56.591850 4931 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:04:56 crc kubenswrapper[4931]: I1201 15:04:56.592434 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:57 crc kubenswrapper[4931]: I1201 15:04:57.955002 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 15:04:57 crc kubenswrapper[4931]: I1201 15:04:57.956544 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.083248 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-var-lock\") pod \"3eee31af-f548-4846-ae56-affc50023793\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.083463 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eee31af-f548-4846-ae56-affc50023793-kube-api-access\") pod \"3eee31af-f548-4846-ae56-affc50023793\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 
15:04:58.083502 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-kubelet-dir\") pod \"3eee31af-f548-4846-ae56-affc50023793\" (UID: \"3eee31af-f548-4846-ae56-affc50023793\") " Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.083534 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-var-lock" (OuterVolumeSpecName: "var-lock") pod "3eee31af-f548-4846-ae56-affc50023793" (UID: "3eee31af-f548-4846-ae56-affc50023793"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.083820 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3eee31af-f548-4846-ae56-affc50023793" (UID: "3eee31af-f548-4846-ae56-affc50023793"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.084073 4931 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.084096 4931 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3eee31af-f548-4846-ae56-affc50023793-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.091158 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eee31af-f548-4846-ae56-affc50023793-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3eee31af-f548-4846-ae56-affc50023793" (UID: "3eee31af-f548-4846-ae56-affc50023793"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.185727 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3eee31af-f548-4846-ae56-affc50023793-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.444247 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.445371 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.446106 4931 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.446443 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.490268 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.490366 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.490371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.490463 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.490479 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.490518 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.490847 4931 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.490865 4931 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.490877 4931 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.615295 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.616916 4931 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55" exitCode=0 Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.617007 4931 scope.go:117] "RemoveContainer" containerID="2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.617067 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.619678 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3eee31af-f548-4846-ae56-affc50023793","Type":"ContainerDied","Data":"cc1f8c18d0238f8d2af931d0bec408ca9c508b9b8cd4a96b206faafd46bac800"} Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.619721 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc1f8c18d0238f8d2af931d0bec408ca9c508b9b8cd4a96b206faafd46bac800" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.619739 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.625107 4931 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.626033 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.637629 4931 scope.go:117] "RemoveContainer" containerID="87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.637615 4931 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.638318 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.651898 4931 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.652582 4931 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.652682 4931 scope.go:117] "RemoveContainer" containerID="74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.653124 4931 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.653715 4931 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.654214 4931 controller.go:195] "Failed to update 
lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.654276 4931 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.654764 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="200ms" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.672488 4931 scope.go:117] "RemoveContainer" containerID="8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.688584 4931 scope.go:117] "RemoveContainer" containerID="f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.711021 4931 scope.go:117] "RemoveContainer" containerID="ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.737086 4931 scope.go:117] "RemoveContainer" containerID="2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.737671 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\": container with ID starting with 2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7 not found: ID does not exist" containerID="2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.737762 4931 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7"} err="failed to get container status \"2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\": rpc error: code = NotFound desc = could not find container \"2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7\": container with ID starting with 2ed4cb70e46ab6ed332dd154092ebb010dc869fc7bb7126483dd363e1af9c3b7 not found: ID does not exist" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.737832 4931 scope.go:117] "RemoveContainer" containerID="87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.738229 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\": container with ID starting with 87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c not found: ID does not exist" containerID="87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.738261 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c"} err="failed to get container status \"87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\": rpc error: code = NotFound desc = could not find container \"87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c\": container with ID starting with 87773b2d60ae025ff21c54f608ca0d7057773cd75d21b6f694f0b12f7f49e38c not found: ID does not exist" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.738284 4931 scope.go:117] "RemoveContainer" containerID="74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.738716 4931 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\": container with ID starting with 74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a not found: ID does not exist" containerID="74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.738738 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a"} err="failed to get container status \"74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\": rpc error: code = NotFound desc = could not find container \"74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a\": container with ID starting with 74330c24dcf1c7ea5abd7e3b43629d99295fd5c516d16087fb4c54dbfce3a84a not found: ID does not exist" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.738753 4931 scope.go:117] "RemoveContainer" containerID="8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.739130 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\": container with ID starting with 8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf not found: ID does not exist" containerID="8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.739164 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf"} err="failed to get container status \"8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\": rpc error: code = NotFound desc = could 
not find container \"8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf\": container with ID starting with 8576f9da9d778fe8a66830ae244a2deb02f925594278fa677eb6970b5bfd4abf not found: ID does not exist" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.739181 4931 scope.go:117] "RemoveContainer" containerID="f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.739797 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\": container with ID starting with f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55 not found: ID does not exist" containerID="f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.739822 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55"} err="failed to get container status \"f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\": rpc error: code = NotFound desc = could not find container \"f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55\": container with ID starting with f50346205a26491110886ff6ccaad0a18c82c1495c13f1f6894304a27bad8b55 not found: ID does not exist" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.739841 4931 scope.go:117] "RemoveContainer" containerID="ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.741264 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\": container with ID starting with ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf not found: 
ID does not exist" containerID="ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf" Dec 01 15:04:58 crc kubenswrapper[4931]: I1201 15:04:58.741326 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf"} err="failed to get container status \"ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\": rpc error: code = NotFound desc = could not find container \"ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf\": container with ID starting with ab5f7341ae162b4ae70bec00d24fa5a3ad656f328db55eec3e3313da12c7b7bf not found: ID does not exist" Dec 01 15:04:58 crc kubenswrapper[4931]: E1201 15:04:58.855163 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="400ms" Dec 01 15:04:59 crc kubenswrapper[4931]: E1201 15:04:59.256169 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="800ms" Dec 01 15:05:00 crc kubenswrapper[4931]: E1201 15:05:00.057349 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="1.6s" Dec 01 15:05:00 crc kubenswrapper[4931]: I1201 15:05:00.249169 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 15:05:01 crc kubenswrapper[4931]: E1201 15:05:01.658446 4931 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="3.2s" Dec 01 15:05:04 crc kubenswrapper[4931]: I1201 15:05:04.247421 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:05:04 crc kubenswrapper[4931]: E1201 15:05:04.860145 4931 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="6.4s" Dec 01 15:05:06 crc kubenswrapper[4931]: E1201 15:05:06.144812 4931 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.2:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d1fb6159a12fc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 15:04:55.906030332 +0000 UTC m=+242.331903999,LastTimestamp:2025-12-01 
15:04:55.906030332 +0000 UTC m=+242.331903999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 15:05:07 crc kubenswrapper[4931]: E1201 15:05:07.312672 4931 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" volumeName="registry-storage" Dec 01 15:05:09 crc kubenswrapper[4931]: I1201 15:05:09.709540 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 15:05:09 crc kubenswrapper[4931]: I1201 15:05:09.710132 4931 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d" exitCode=1 Dec 01 15:05:09 crc kubenswrapper[4931]: I1201 15:05:09.710185 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d"} Dec 01 15:05:09 crc kubenswrapper[4931]: I1201 15:05:09.710861 4931 scope.go:117] "RemoveContainer" containerID="8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d" Dec 01 15:05:09 crc kubenswrapper[4931]: I1201 15:05:09.712435 4931 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:05:09 crc kubenswrapper[4931]: I1201 15:05:09.713090 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:05:09 crc kubenswrapper[4931]: I1201 15:05:09.990775 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.240937 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.243607 4931 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.244839 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.267950 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:10 crc 
kubenswrapper[4931]: I1201 15:05:10.268195 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:10 crc kubenswrapper[4931]: E1201 15:05:10.269136 4931 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.269811 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.726819 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.726993 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a17273549b5c88cea172376183b68f97e94fca6912150cf1061adceec829e221"} Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.728851 4931 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.729644 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.729954 4931 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c871d95b7561f3bd92c16cefdf1b3699eb7773a56d45e8e88f49d95763b43665" exitCode=0 Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.730002 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c871d95b7561f3bd92c16cefdf1b3699eb7773a56d45e8e88f49d95763b43665"} Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.730031 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9dba609958b2b59018eef31258488eb96c0207ca9779650909a2a5271fda0260"} Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.730459 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.730490 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.731195 4931 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:05:10 crc kubenswrapper[4931]: E1201 15:05:10.731293 4931 mirror_client.go:138] "Failed deleting a mirror pod" 
err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:10 crc kubenswrapper[4931]: I1201 15:05:10.731562 4931 status_manager.go:851] "Failed to get status for pod" podUID="3eee31af-f548-4846-ae56-affc50023793" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 01 15:05:11 crc kubenswrapper[4931]: I1201 15:05:11.327429 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:05:11 crc kubenswrapper[4931]: I1201 15:05:11.327839 4931 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 15:05:11 crc kubenswrapper[4931]: I1201 15:05:11.327998 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 15:05:11 crc kubenswrapper[4931]: I1201 15:05:11.764798 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d5edb242fcf32e6d41666d9c3caca9867e796d79de56382a3143f856f997007b"} Dec 01 15:05:11 crc kubenswrapper[4931]: I1201 15:05:11.764862 4931 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d37857e106a696feb95c2107a5e9c0c32ba681031bdbe7755bffb750b96d2ec9"} Dec 01 15:05:11 crc kubenswrapper[4931]: I1201 15:05:11.764873 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5c8f7de349a2033d019ae3ced0b010cfb16d55d19976cee08104303975569622"} Dec 01 15:05:12 crc kubenswrapper[4931]: I1201 15:05:12.774120 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bacfb83b990a5d086bf03b87044e511f5d46017e10c1a53681713bb261882481"} Dec 01 15:05:12 crc kubenswrapper[4931]: I1201 15:05:12.774512 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bdb17efb694816dad1458b19865950d46e449171fccfb6e5d3c98f98bc5a1cda"} Dec 01 15:05:12 crc kubenswrapper[4931]: I1201 15:05:12.774526 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:12 crc kubenswrapper[4931]: I1201 15:05:12.774563 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:15 crc kubenswrapper[4931]: I1201 15:05:15.270576 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:15 crc kubenswrapper[4931]: I1201 15:05:15.270686 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:15 crc kubenswrapper[4931]: 
I1201 15:05:15.280180 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:15 crc kubenswrapper[4931]: I1201 15:05:15.555629 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:05:17 crc kubenswrapper[4931]: I1201 15:05:17.788091 4931 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:17 crc kubenswrapper[4931]: I1201 15:05:17.852053 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d9c845ab-db66-449f-99a6-97d9b1610c4e" Dec 01 15:05:18 crc kubenswrapper[4931]: I1201 15:05:18.809338 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:18 crc kubenswrapper[4931]: I1201 15:05:18.809453 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:18 crc kubenswrapper[4931]: I1201 15:05:18.809781 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:18 crc kubenswrapper[4931]: I1201 15:05:18.813263 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d9c845ab-db66-449f-99a6-97d9b1610c4e" Dec 01 15:05:18 crc kubenswrapper[4931]: I1201 15:05:18.814709 4931 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://5c8f7de349a2033d019ae3ced0b010cfb16d55d19976cee08104303975569622" Dec 01 
15:05:18 crc kubenswrapper[4931]: I1201 15:05:18.814762 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:19 crc kubenswrapper[4931]: I1201 15:05:19.818180 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:19 crc kubenswrapper[4931]: I1201 15:05:19.818246 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:19 crc kubenswrapper[4931]: I1201 15:05:19.823051 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d9c845ab-db66-449f-99a6-97d9b1610c4e" Dec 01 15:05:20 crc kubenswrapper[4931]: I1201 15:05:20.826359 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:20 crc kubenswrapper[4931]: I1201 15:05:20.826447 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:20 crc kubenswrapper[4931]: I1201 15:05:20.832171 4931 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d9c845ab-db66-449f-99a6-97d9b1610c4e" Dec 01 15:05:21 crc kubenswrapper[4931]: I1201 15:05:21.328189 4931 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 15:05:21 crc 
kubenswrapper[4931]: I1201 15:05:21.328312 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 15:05:27 crc kubenswrapper[4931]: I1201 15:05:27.629195 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 15:05:27 crc kubenswrapper[4931]: I1201 15:05:27.788844 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 15:05:27 crc kubenswrapper[4931]: I1201 15:05:27.924118 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 15:05:28 crc kubenswrapper[4931]: I1201 15:05:28.049642 4931 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 15:05:28 crc kubenswrapper[4931]: I1201 15:05:28.198223 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 15:05:28 crc kubenswrapper[4931]: I1201 15:05:28.256159 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 15:05:28 crc kubenswrapper[4931]: I1201 15:05:28.299584 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 15:05:28 crc kubenswrapper[4931]: I1201 15:05:28.574489 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 15:05:29 crc kubenswrapper[4931]: I1201 15:05:29.010526 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 15:05:29 crc kubenswrapper[4931]: I1201 15:05:29.348721 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 15:05:29 crc kubenswrapper[4931]: I1201 15:05:29.414921 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 15:05:29 crc kubenswrapper[4931]: I1201 15:05:29.765955 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 15:05:29 crc kubenswrapper[4931]: I1201 15:05:29.768536 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 15:05:29 crc kubenswrapper[4931]: I1201 15:05:29.832720 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 15:05:29 crc kubenswrapper[4931]: I1201 15:05:29.909219 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 15:05:30 crc kubenswrapper[4931]: I1201 15:05:30.730670 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 15:05:30 crc kubenswrapper[4931]: I1201 15:05:30.746300 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 15:05:30 crc kubenswrapper[4931]: I1201 15:05:30.757947 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 15:05:30 crc kubenswrapper[4931]: I1201 15:05:30.775463 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 15:05:30 crc kubenswrapper[4931]: I1201 15:05:30.789611 4931 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 15:05:30 crc kubenswrapper[4931]: I1201 15:05:30.898165 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.026504 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.171778 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.242231 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.275330 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.275433 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.306416 4931 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.327721 4931 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.327792 4931 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.327853 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.328597 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"a17273549b5c88cea172376183b68f97e94fca6912150cf1061adceec829e221"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.328734 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://a17273549b5c88cea172376183b68f97e94fca6912150cf1061adceec829e221" gracePeriod=30 Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.454058 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.556695 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.569185 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.570435 4931 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.602516 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.628912 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.709210 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.733861 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.909833 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 15:05:31 crc kubenswrapper[4931]: I1201 15:05:31.918276 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.008742 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.017265 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.035292 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.091740 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.477446 4931 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.539124 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.545235 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.556122 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.612154 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.902127 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.916453 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 15:05:32 crc kubenswrapper[4931]: I1201 15:05:32.933548 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.033058 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.168794 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.183499 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.227785 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.267520 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.275545 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.302511 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.337875 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.462806 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.541058 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.543060 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.570137 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.614446 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 
15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.633324 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.639891 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.646712 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.698296 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.777494 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.879269 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 15:05:33 crc kubenswrapper[4931]: I1201 15:05:33.979225 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.143790 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.317929 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.428631 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.501173 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.696073 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.733065 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.786690 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.892461 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.899941 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.914597 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.929941 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.931734 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 15:05:34 crc kubenswrapper[4931]: I1201 15:05:34.975014 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.005799 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.068258 4931 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.122933 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.192944 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.256464 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.261104 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.265623 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.347169 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.725359 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.740252 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.773931 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 15:05:35.837315 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 15:05:35 crc kubenswrapper[4931]: I1201 
15:05:35.838037 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.024710 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.032427 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.038166 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.080612 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.109634 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.118256 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.192887 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.214854 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.217683 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.378994 4931 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.403917 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.420301 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.482379 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.505726 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.544004 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.561051 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.663246 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.667689 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.751353 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 15:05:36 crc kubenswrapper[4931]: I1201 15:05:36.754034 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 15:05:36 crc 
kubenswrapper[4931]: I1201 15:05:36.946772 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.082754 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.164519 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.200869 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.227512 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.298203 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.320713 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.337956 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.591813 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.667875 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.677863 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.850748 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.879190 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.908928 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 15:05:37 crc kubenswrapper[4931]: I1201 15:05:37.942155 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.046787 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.109617 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.141217 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.142606 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.164422 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.224415 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.246348 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.247140 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.265238 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.315521 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.326615 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.332134 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.376538 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.399880 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.467694 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.516564 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.525341 4931 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.626598 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.669574 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.754857 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.760686 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.782898 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.804464 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.804747 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.918004 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.937001 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.948357 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 15:05:38 crc kubenswrapper[4931]: I1201 15:05:38.990055 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.015844 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.141079 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.234618 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.382892 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.443501 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.491180 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.569717 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.569919 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.641147 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.698631 4931 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.698745 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.720700 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.837555 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.892042 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.898086 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.939020 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 15:05:39 crc kubenswrapper[4931]: I1201 15:05:39.965773 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.082354 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.094975 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.171543 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.253483 4931 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.379531 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.454706 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.559175 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.666301 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.765773 4931 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.786479 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.786620 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-6869cbc5df-d7r2w"] Dec 01 15:05:40 crc kubenswrapper[4931]: E1201 15:05:40.786973 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eee31af-f548-4846-ae56-affc50023793" containerName="installer" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.786997 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eee31af-f548-4846-ae56-affc50023793" containerName="installer" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.787172 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eee31af-f548-4846-ae56-affc50023793" containerName="installer" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 
15:05:40.787996 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.788259 4931 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.788315 4931 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="1299bbfe-9ffb-483a-ba5a-ea391efdc803" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.794685 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.794961 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.795194 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.795956 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.802462 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.803426 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.803677 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.804213 4931 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.810764 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.812313 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.815957 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.816026 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.816675 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.836778 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.836751151 podStartE2EDuration="23.836751151s" podCreationTimestamp="2025-12-01 15:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:05:40.83362046 +0000 UTC m=+287.259494147" watchObservedRunningTime="2025-12-01 15:05:40.836751151 +0000 UTC m=+287.262624838" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.846144 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.854107 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 
15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.861254 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.898054 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.900438 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912065 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skp9p\" (UniqueName: \"kubernetes.io/projected/9296df7d-06e7-4f39-9e2f-05218f833aec-kube-api-access-skp9p\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912116 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-session\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912151 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 
15:05:40.912173 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-template-error\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912194 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-audit-policies\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912211 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-template-login\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912232 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912247 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912282 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912306 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-router-certs\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912326 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9296df7d-06e7-4f39-9e2f-05218f833aec-audit-dir\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912354 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-service-ca\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " 
pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912370 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:40 crc kubenswrapper[4931]: I1201 15:05:40.912425 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.014612 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.014703 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-template-error\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.014753 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-audit-policies\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.014792 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-template-login\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.014840 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.014884 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.014953 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.014996 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-router-certs\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.015035 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9296df7d-06e7-4f39-9e2f-05218f833aec-audit-dir\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.015082 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-service-ca\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.015122 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.015175 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.015219 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skp9p\" (UniqueName: \"kubernetes.io/projected/9296df7d-06e7-4f39-9e2f-05218f833aec-kube-api-access-skp9p\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.015269 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-session\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.015654 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-audit-policies\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.015723 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9296df7d-06e7-4f39-9e2f-05218f833aec-audit-dir\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " 
pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.016116 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.016956 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-service-ca\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.017672 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.022974 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.022969 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.023177 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-router-certs\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.023264 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.023314 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-session\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.023426 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-template-login\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " 
pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.024912 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-user-template-error\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.027251 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9296df7d-06e7-4f39-9e2f-05218f833aec-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.033008 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skp9p\" (UniqueName: \"kubernetes.io/projected/9296df7d-06e7-4f39-9e2f-05218f833aec-kube-api-access-skp9p\") pod \"oauth-openshift-6869cbc5df-d7r2w\" (UID: \"9296df7d-06e7-4f39-9e2f-05218f833aec\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.038366 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.100937 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.116931 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.136171 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.210263 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.280740 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.445881 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.507426 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.537994 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.556303 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6869cbc5df-d7r2w"] Dec 01 15:05:41 crc kubenswrapper[4931]: W1201 15:05:41.559191 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9296df7d_06e7_4f39_9e2f_05218f833aec.slice/crio-7a4b78989599160af33311f1e682122723b11ea63088f1e898040782ff053070 WatchSource:0}: Error finding container 7a4b78989599160af33311f1e682122723b11ea63088f1e898040782ff053070: Status 404 returned error can't find the container with id 7a4b78989599160af33311f1e682122723b11ea63088f1e898040782ff053070 Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.569317 4931 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.617515 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.624654 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.674748 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.712910 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.738040 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.757296 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.769967 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.838267 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.879165 4931 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.955079 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.967297 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" event={"ID":"9296df7d-06e7-4f39-9e2f-05218f833aec","Type":"ContainerStarted","Data":"37fe7240d7490e251e3e6fe46df7c29fe828f67612bafbd17ced8355f40d8f1c"} Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.967349 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" event={"ID":"9296df7d-06e7-4f39-9e2f-05218f833aec","Type":"ContainerStarted","Data":"7a4b78989599160af33311f1e682122723b11ea63088f1e898040782ff053070"} Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.967850 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.978606 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 15:05:41 crc kubenswrapper[4931]: I1201 15:05:41.991773 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" podStartSLOduration=79.99175344 podStartE2EDuration="1m19.99175344s" podCreationTimestamp="2025-12-01 15:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:05:41.988464175 +0000 UTC m=+288.414337842" watchObservedRunningTime="2025-12-01 15:05:41.99175344 +0000 UTC m=+288.417627137" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.014958 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.045186 4931 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.117611 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.176786 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.198477 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6869cbc5df-d7r2w" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.214573 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.278598 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.300997 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.433709 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.548201 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.602809 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.790319 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.818984 4931 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 15:05:42 crc kubenswrapper[4931]: I1201 15:05:42.981979 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 15:05:43 crc kubenswrapper[4931]: I1201 15:05:43.000214 4931 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 15:05:43 crc kubenswrapper[4931]: I1201 15:05:43.025467 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 15:05:43 crc kubenswrapper[4931]: I1201 15:05:43.086256 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 15:05:43 crc kubenswrapper[4931]: I1201 15:05:43.129359 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 15:05:43 crc kubenswrapper[4931]: I1201 15:05:43.195365 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 15:05:43 crc kubenswrapper[4931]: I1201 15:05:43.287022 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 15:05:43 crc kubenswrapper[4931]: I1201 15:05:43.503941 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 15:05:43 crc kubenswrapper[4931]: I1201 15:05:43.789375 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 15:05:43 crc kubenswrapper[4931]: I1201 15:05:43.952326 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 15:05:44 crc 
kubenswrapper[4931]: I1201 15:05:44.023737 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 15:05:44 crc kubenswrapper[4931]: I1201 15:05:44.029775 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 15:05:44 crc kubenswrapper[4931]: I1201 15:05:44.115956 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 15:05:44 crc kubenswrapper[4931]: I1201 15:05:44.200132 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 15:05:44 crc kubenswrapper[4931]: I1201 15:05:44.240614 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 15:05:45 crc kubenswrapper[4931]: I1201 15:05:45.591276 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 15:05:51 crc kubenswrapper[4931]: I1201 15:05:51.792328 4931 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 15:05:51 crc kubenswrapper[4931]: I1201 15:05:51.794065 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5a8f3a581d83a9a876ad7036013f352e24956b06748fa4d3a791b936ee46b581" gracePeriod=5 Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.088002 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.088996 4931 generic.go:334] 
"Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5a8f3a581d83a9a876ad7036013f352e24956b06748fa4d3a791b936ee46b581" exitCode=137 Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.393571 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.393695 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.475604 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.475749 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.475785 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.475819 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 15:05:57 crc 
kubenswrapper[4931]: I1201 15:05:57.475884 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.475927 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.475926 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.476027 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.476062 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.476261 4931 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.476278 4931 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.476292 4931 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.476304 4931 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.485529 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:05:57 crc kubenswrapper[4931]: I1201 15:05:57.577370 4931 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 15:05:58 crc kubenswrapper[4931]: I1201 15:05:58.098471 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 15:05:58 crc kubenswrapper[4931]: I1201 15:05:58.098670 4931 scope.go:117] "RemoveContainer" containerID="5a8f3a581d83a9a876ad7036013f352e24956b06748fa4d3a791b936ee46b581" Dec 01 15:05:58 crc kubenswrapper[4931]: I1201 15:05:58.098745 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 15:05:58 crc kubenswrapper[4931]: I1201 15:05:58.101970 4931 generic.go:334] "Generic (PLEG): container finished" podID="98a131c9-fc6c-4a27-a774-227258b380c0" containerID="9dc35aace1b82369a302f408af3779655c31571b76bee8ff330872097cb54890" exitCode=0 Dec 01 15:05:58 crc kubenswrapper[4931]: I1201 15:05:58.102025 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" event={"ID":"98a131c9-fc6c-4a27-a774-227258b380c0","Type":"ContainerDied","Data":"9dc35aace1b82369a302f408af3779655c31571b76bee8ff330872097cb54890"} Dec 01 15:05:58 crc kubenswrapper[4931]: I1201 15:05:58.102869 4931 scope.go:117] "RemoveContainer" containerID="9dc35aace1b82369a302f408af3779655c31571b76bee8ff330872097cb54890" Dec 01 15:05:58 crc kubenswrapper[4931]: I1201 15:05:58.249010 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 01 15:05:59 crc 
kubenswrapper[4931]: I1201 15:05:59.116409 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" event={"ID":"98a131c9-fc6c-4a27-a774-227258b380c0","Type":"ContainerStarted","Data":"0cdc2764b445ae2f9bf7c48c706b9c0078cf6bc2aab4ecb64ca04314c92a4e35"} Dec 01 15:05:59 crc kubenswrapper[4931]: I1201 15:05:59.118062 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:05:59 crc kubenswrapper[4931]: I1201 15:05:59.122787 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:06:02 crc kubenswrapper[4931]: I1201 15:06:02.141913 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 01 15:06:02 crc kubenswrapper[4931]: I1201 15:06:02.145511 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 15:06:02 crc kubenswrapper[4931]: I1201 15:06:02.145593 4931 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a17273549b5c88cea172376183b68f97e94fca6912150cf1061adceec829e221" exitCode=137 Dec 01 15:06:02 crc kubenswrapper[4931]: I1201 15:06:02.145638 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a17273549b5c88cea172376183b68f97e94fca6912150cf1061adceec829e221"} Dec 01 15:06:02 crc kubenswrapper[4931]: I1201 15:06:02.145692 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1899e86c79bb02f5c45201fb12dfe7256e9137bb4dffe6ab825b536d3b36900f"} Dec 01 15:06:02 crc kubenswrapper[4931]: I1201 15:06:02.145723 4931 scope.go:117] "RemoveContainer" containerID="8cd183ece28d9d96b8f64f9887b7ad7c2b06514f3ddedd64e8019b4a7ad3cd4d" Dec 01 15:06:03 crc kubenswrapper[4931]: I1201 15:06:03.167276 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 01 15:06:05 crc kubenswrapper[4931]: I1201 15:06:05.555574 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:06:11 crc kubenswrapper[4931]: I1201 15:06:11.327207 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:06:11 crc kubenswrapper[4931]: I1201 15:06:11.331060 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:06:12 crc kubenswrapper[4931]: I1201 15:06:12.233094 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 15:06:15 crc kubenswrapper[4931]: I1201 15:06:15.189287 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 15:06:15 crc kubenswrapper[4931]: I1201 15:06:15.775173 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 15:06:20 crc kubenswrapper[4931]: I1201 15:06:20.240470 4931 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 15:06:29 crc kubenswrapper[4931]: I1201 15:06:29.805750 4931 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm"] Dec 01 15:06:29 crc kubenswrapper[4931]: I1201 15:06:29.806687 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" podUID="1024faa3-55d0-47a5-ad2e-745ec92c0c89" containerName="route-controller-manager" containerID="cri-o://b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e" gracePeriod=30 Dec 01 15:06:29 crc kubenswrapper[4931]: I1201 15:06:29.814859 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ng4qc"] Dec 01 15:06:29 crc kubenswrapper[4931]: I1201 15:06:29.815164 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" podUID="10126253-d0e8-4f30-9047-1780a718e251" containerName="controller-manager" containerID="cri-o://d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33" gracePeriod=30 Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.227419 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.231529 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.268242 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr7b4\" (UniqueName: \"kubernetes.io/projected/10126253-d0e8-4f30-9047-1780a718e251-kube-api-access-fr7b4\") pod \"10126253-d0e8-4f30-9047-1780a718e251\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.268309 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-config\") pod \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.268428 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10126253-d0e8-4f30-9047-1780a718e251-serving-cert\") pod \"10126253-d0e8-4f30-9047-1780a718e251\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.268478 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1024faa3-55d0-47a5-ad2e-745ec92c0c89-serving-cert\") pod \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.268499 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-config\") pod \"10126253-d0e8-4f30-9047-1780a718e251\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.268525 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-client-ca\") pod \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.268571 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-proxy-ca-bundles\") pod \"10126253-d0e8-4f30-9047-1780a718e251\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.268600 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-client-ca\") pod \"10126253-d0e8-4f30-9047-1780a718e251\" (UID: \"10126253-d0e8-4f30-9047-1780a718e251\") " Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.268649 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9xwt\" (UniqueName: \"kubernetes.io/projected/1024faa3-55d0-47a5-ad2e-745ec92c0c89-kube-api-access-v9xwt\") pod \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\" (UID: \"1024faa3-55d0-47a5-ad2e-745ec92c0c89\") " Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.269934 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-config" (OuterVolumeSpecName: "config") pod "10126253-d0e8-4f30-9047-1780a718e251" (UID: "10126253-d0e8-4f30-9047-1780a718e251"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.269977 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "10126253-d0e8-4f30-9047-1780a718e251" (UID: "10126253-d0e8-4f30-9047-1780a718e251"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.270399 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-client-ca" (OuterVolumeSpecName: "client-ca") pod "1024faa3-55d0-47a5-ad2e-745ec92c0c89" (UID: "1024faa3-55d0-47a5-ad2e-745ec92c0c89"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.270623 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-client-ca" (OuterVolumeSpecName: "client-ca") pod "10126253-d0e8-4f30-9047-1780a718e251" (UID: "10126253-d0e8-4f30-9047-1780a718e251"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.271531 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-config" (OuterVolumeSpecName: "config") pod "1024faa3-55d0-47a5-ad2e-745ec92c0c89" (UID: "1024faa3-55d0-47a5-ad2e-745ec92c0c89"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.276305 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1024faa3-55d0-47a5-ad2e-745ec92c0c89-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1024faa3-55d0-47a5-ad2e-745ec92c0c89" (UID: "1024faa3-55d0-47a5-ad2e-745ec92c0c89"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.277810 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10126253-d0e8-4f30-9047-1780a718e251-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "10126253-d0e8-4f30-9047-1780a718e251" (UID: "10126253-d0e8-4f30-9047-1780a718e251"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.278706 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10126253-d0e8-4f30-9047-1780a718e251-kube-api-access-fr7b4" (OuterVolumeSpecName: "kube-api-access-fr7b4") pod "10126253-d0e8-4f30-9047-1780a718e251" (UID: "10126253-d0e8-4f30-9047-1780a718e251"). InnerVolumeSpecName "kube-api-access-fr7b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.278952 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1024faa3-55d0-47a5-ad2e-745ec92c0c89-kube-api-access-v9xwt" (OuterVolumeSpecName: "kube-api-access-v9xwt") pod "1024faa3-55d0-47a5-ad2e-745ec92c0c89" (UID: "1024faa3-55d0-47a5-ad2e-745ec92c0c89"). InnerVolumeSpecName "kube-api-access-v9xwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.340285 4931 generic.go:334] "Generic (PLEG): container finished" podID="10126253-d0e8-4f30-9047-1780a718e251" containerID="d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33" exitCode=0 Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.340402 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.340370 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" event={"ID":"10126253-d0e8-4f30-9047-1780a718e251","Type":"ContainerDied","Data":"d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33"} Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.340950 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ng4qc" event={"ID":"10126253-d0e8-4f30-9047-1780a718e251","Type":"ContainerDied","Data":"af896787a11a18255bc4a1159c06753e7b1b57e19c8a52bb6afb58397f56c974"} Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.340983 4931 scope.go:117] "RemoveContainer" containerID="d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.341972 4931 generic.go:334] "Generic (PLEG): container finished" podID="1024faa3-55d0-47a5-ad2e-745ec92c0c89" containerID="b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e" exitCode=0 Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.342011 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" event={"ID":"1024faa3-55d0-47a5-ad2e-745ec92c0c89","Type":"ContainerDied","Data":"b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e"} Dec 01 
15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.342046 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" event={"ID":"1024faa3-55d0-47a5-ad2e-745ec92c0c89","Type":"ContainerDied","Data":"ebbb262a8e8c34c06d48c63da9f6ca39d9b2a84e972ac2728597e0ff984ec062"} Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.342109 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.368877 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm"] Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.369328 4931 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.369356 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9xwt\" (UniqueName: \"kubernetes.io/projected/1024faa3-55d0-47a5-ad2e-745ec92c0c89-kube-api-access-v9xwt\") on node \"crc\" DevicePath \"\"" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.369368 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr7b4\" (UniqueName: \"kubernetes.io/projected/10126253-d0e8-4f30-9047-1780a718e251-kube-api-access-fr7b4\") on node \"crc\" DevicePath \"\"" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.369378 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.369400 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/10126253-d0e8-4f30-9047-1780a718e251-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.369432 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1024faa3-55d0-47a5-ad2e-745ec92c0c89-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.369440 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.369449 4931 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1024faa3-55d0-47a5-ad2e-745ec92c0c89-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.369475 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10126253-d0e8-4f30-9047-1780a718e251-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.371735 4931 scope.go:117] "RemoveContainer" containerID="d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33" Dec 01 15:06:30 crc kubenswrapper[4931]: E1201 15:06:30.372211 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33\": container with ID starting with d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33 not found: ID does not exist" containerID="d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.372246 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33"} err="failed to get container status \"d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33\": rpc error: code = NotFound desc = could not find container \"d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33\": container with ID starting with d2ce9de9ba2aa1cc21f92a98118eac9dccf5e5cb27e58d6875547320d7bf2e33 not found: ID does not exist" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.372272 4931 scope.go:117] "RemoveContainer" containerID="b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.374866 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b9wzm"] Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.381241 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ng4qc"] Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.385132 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ng4qc"] Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.389970 4931 scope.go:117] "RemoveContainer" containerID="b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e" Dec 01 15:06:30 crc kubenswrapper[4931]: E1201 15:06:30.392951 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e\": container with ID starting with b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e not found: ID does not exist" containerID="b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e" Dec 01 15:06:30 crc kubenswrapper[4931]: I1201 15:06:30.393024 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e"} err="failed to get container status \"b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e\": rpc error: code = NotFound desc = could not find container \"b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e\": container with ID starting with b81c33ae96360222912c6c32dd12da1802b97b0c6f102857b774531e3c80ba3e not found: ID does not exist" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.504051 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-986db4786-4lf42"] Dec 01 15:06:31 crc kubenswrapper[4931]: E1201 15:06:31.504288 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10126253-d0e8-4f30-9047-1780a718e251" containerName="controller-manager" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.504302 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="10126253-d0e8-4f30-9047-1780a718e251" containerName="controller-manager" Dec 01 15:06:31 crc kubenswrapper[4931]: E1201 15:06:31.504316 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.504323 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 15:06:31 crc kubenswrapper[4931]: E1201 15:06:31.504336 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1024faa3-55d0-47a5-ad2e-745ec92c0c89" containerName="route-controller-manager" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.504342 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="1024faa3-55d0-47a5-ad2e-745ec92c0c89" containerName="route-controller-manager" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.504451 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1024faa3-55d0-47a5-ad2e-745ec92c0c89" containerName="route-controller-manager" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.504462 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="10126253-d0e8-4f30-9047-1780a718e251" containerName="controller-manager" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.504469 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.504874 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.508775 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.508971 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.509672 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.510247 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.510716 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.511158 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.519533 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-558d65cfdd-ppv4m"] Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.520723 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.525997 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-986db4786-4lf42"] Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.529221 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.529302 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.529415 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.529832 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.529910 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.533799 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.534925 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.542325 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-558d65cfdd-ppv4m"] Dec 01 15:06:31 crc 
kubenswrapper[4931]: I1201 15:06:31.584580 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-client-ca\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.584685 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hc4b\" (UniqueName: \"kubernetes.io/projected/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-kube-api-access-2hc4b\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.584748 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-config\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.584775 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-client-ca\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.584836 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-config\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.584868 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-serving-cert\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.584898 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-proxy-ca-bundles\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.584921 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6g8k\" (UniqueName: \"kubernetes.io/projected/4e534b96-e041-4d5e-b6f6-730864961ca5-kube-api-access-r6g8k\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.584956 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e534b96-e041-4d5e-b6f6-730864961ca5-serving-cert\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " 
pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.685734 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-client-ca\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.685796 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hc4b\" (UniqueName: \"kubernetes.io/projected/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-kube-api-access-2hc4b\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.685827 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-config\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.685846 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-client-ca\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.685877 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-config\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.685899 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-serving-cert\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.685924 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-proxy-ca-bundles\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.685944 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6g8k\" (UniqueName: \"kubernetes.io/projected/4e534b96-e041-4d5e-b6f6-730864961ca5-kube-api-access-r6g8k\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.685974 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e534b96-e041-4d5e-b6f6-730864961ca5-serving-cert\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 
15:06:31.686929 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-client-ca\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.687439 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-config\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.687675 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-client-ca\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.688171 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-config\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.689021 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-proxy-ca-bundles\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " 
pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.691589 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e534b96-e041-4d5e-b6f6-730864961ca5-serving-cert\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.702149 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-serving-cert\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.704992 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hc4b\" (UniqueName: \"kubernetes.io/projected/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-kube-api-access-2hc4b\") pod \"route-controller-manager-986db4786-4lf42\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.716620 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6g8k\" (UniqueName: \"kubernetes.io/projected/4e534b96-e041-4d5e-b6f6-730864961ca5-kube-api-access-r6g8k\") pod \"controller-manager-558d65cfdd-ppv4m\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.822082 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:31 crc kubenswrapper[4931]: I1201 15:06:31.843599 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.050344 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-558d65cfdd-ppv4m"] Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.160261 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-986db4786-4lf42"] Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.249808 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10126253-d0e8-4f30-9047-1780a718e251" path="/var/lib/kubelet/pods/10126253-d0e8-4f30-9047-1780a718e251/volumes" Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.251353 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1024faa3-55d0-47a5-ad2e-745ec92c0c89" path="/var/lib/kubelet/pods/1024faa3-55d0-47a5-ad2e-745ec92c0c89/volumes" Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.362146 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" event={"ID":"4e534b96-e041-4d5e-b6f6-730864961ca5","Type":"ContainerStarted","Data":"716343b7c4b5331c9f90f95aed76d7ba62aaff5cf5e1989ea21ac570bb7eb412"} Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.362231 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" event={"ID":"4e534b96-e041-4d5e-b6f6-730864961ca5","Type":"ContainerStarted","Data":"0dcb2eb3d4f2ca9bc1c4c0a3d0f508e2dd61d5bb0e822e3266ede7886460fb8f"} Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.362488 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.363746 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" event={"ID":"2cb458d8-f8e8-4293-9d9d-74cbe0657d07","Type":"ContainerStarted","Data":"22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158"} Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.363779 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" event={"ID":"2cb458d8-f8e8-4293-9d9d-74cbe0657d07","Type":"ContainerStarted","Data":"8295143a31825571235c51dd73790a26bab484096697154276ce259f9fc99f32"} Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.363983 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.365289 4931 patch_prober.go:28] interesting pod/route-controller-manager-986db4786-4lf42 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.365349 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" podUID="2cb458d8-f8e8-4293-9d9d-74cbe0657d07" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.367354 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.385626 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" podStartSLOduration=3.3856019059999998 podStartE2EDuration="3.385601906s" podCreationTimestamp="2025-12-01 15:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:06:32.382529258 +0000 UTC m=+338.808402935" watchObservedRunningTime="2025-12-01 15:06:32.385601906 +0000 UTC m=+338.811475563" Dec 01 15:06:32 crc kubenswrapper[4931]: I1201 15:06:32.402846 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" podStartSLOduration=3.40281628 podStartE2EDuration="3.40281628s" podCreationTimestamp="2025-12-01 15:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:06:32.398159037 +0000 UTC m=+338.824032704" watchObservedRunningTime="2025-12-01 15:06:32.40281628 +0000 UTC m=+338.828689947" Dec 01 15:06:33 crc kubenswrapper[4931]: I1201 15:06:33.377450 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:06:49 crc kubenswrapper[4931]: I1201 15:06:49.872082 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:06:49 crc kubenswrapper[4931]: I1201 15:06:49.872972 4931 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:07:02 crc kubenswrapper[4931]: I1201 15:07:02.395070 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-558d65cfdd-ppv4m"] Dec 01 15:07:02 crc kubenswrapper[4931]: I1201 15:07:02.395976 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" podUID="4e534b96-e041-4d5e-b6f6-730864961ca5" containerName="controller-manager" containerID="cri-o://716343b7c4b5331c9f90f95aed76d7ba62aaff5cf5e1989ea21ac570bb7eb412" gracePeriod=30 Dec 01 15:07:02 crc kubenswrapper[4931]: I1201 15:07:02.490570 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-986db4786-4lf42"] Dec 01 15:07:02 crc kubenswrapper[4931]: I1201 15:07:02.490875 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" podUID="2cb458d8-f8e8-4293-9d9d-74cbe0657d07" containerName="route-controller-manager" containerID="cri-o://22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158" gracePeriod=30 Dec 01 15:07:02 crc kubenswrapper[4931]: I1201 15:07:02.560704 4931 generic.go:334] "Generic (PLEG): container finished" podID="4e534b96-e041-4d5e-b6f6-730864961ca5" containerID="716343b7c4b5331c9f90f95aed76d7ba62aaff5cf5e1989ea21ac570bb7eb412" exitCode=0 Dec 01 15:07:02 crc kubenswrapper[4931]: I1201 15:07:02.560760 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" 
event={"ID":"4e534b96-e041-4d5e-b6f6-730864961ca5","Type":"ContainerDied","Data":"716343b7c4b5331c9f90f95aed76d7ba62aaff5cf5e1989ea21ac570bb7eb412"} Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.037757 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.099699 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.208189 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-client-ca\") pod \"4e534b96-e041-4d5e-b6f6-730864961ca5\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.208243 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-config\") pod \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.208267 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e534b96-e041-4d5e-b6f6-730864961ca5-serving-cert\") pod \"4e534b96-e041-4d5e-b6f6-730864961ca5\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.208296 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-config\") pod \"4e534b96-e041-4d5e-b6f6-730864961ca5\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " Dec 01 15:07:03 crc kubenswrapper[4931]: 
I1201 15:07:03.208318 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-proxy-ca-bundles\") pod \"4e534b96-e041-4d5e-b6f6-730864961ca5\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.208421 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-client-ca\") pod \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.208446 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hc4b\" (UniqueName: \"kubernetes.io/projected/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-kube-api-access-2hc4b\") pod \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.208513 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6g8k\" (UniqueName: \"kubernetes.io/projected/4e534b96-e041-4d5e-b6f6-730864961ca5-kube-api-access-r6g8k\") pod \"4e534b96-e041-4d5e-b6f6-730864961ca5\" (UID: \"4e534b96-e041-4d5e-b6f6-730864961ca5\") " Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.208544 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-serving-cert\") pod \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\" (UID: \"2cb458d8-f8e8-4293-9d9d-74cbe0657d07\") " Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.209223 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-config" (OuterVolumeSpecName: "config") pod 
"2cb458d8-f8e8-4293-9d9d-74cbe0657d07" (UID: "2cb458d8-f8e8-4293-9d9d-74cbe0657d07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.209224 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4e534b96-e041-4d5e-b6f6-730864961ca5" (UID: "4e534b96-e041-4d5e-b6f6-730864961ca5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.209314 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-config" (OuterVolumeSpecName: "config") pod "4e534b96-e041-4d5e-b6f6-730864961ca5" (UID: "4e534b96-e041-4d5e-b6f6-730864961ca5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.209878 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-client-ca" (OuterVolumeSpecName: "client-ca") pod "4e534b96-e041-4d5e-b6f6-730864961ca5" (UID: "4e534b96-e041-4d5e-b6f6-730864961ca5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.210147 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-client-ca" (OuterVolumeSpecName: "client-ca") pod "2cb458d8-f8e8-4293-9d9d-74cbe0657d07" (UID: "2cb458d8-f8e8-4293-9d9d-74cbe0657d07"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.215334 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e534b96-e041-4d5e-b6f6-730864961ca5-kube-api-access-r6g8k" (OuterVolumeSpecName: "kube-api-access-r6g8k") pod "4e534b96-e041-4d5e-b6f6-730864961ca5" (UID: "4e534b96-e041-4d5e-b6f6-730864961ca5"). InnerVolumeSpecName "kube-api-access-r6g8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.215342 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e534b96-e041-4d5e-b6f6-730864961ca5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4e534b96-e041-4d5e-b6f6-730864961ca5" (UID: "4e534b96-e041-4d5e-b6f6-730864961ca5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.215591 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-kube-api-access-2hc4b" (OuterVolumeSpecName: "kube-api-access-2hc4b") pod "2cb458d8-f8e8-4293-9d9d-74cbe0657d07" (UID: "2cb458d8-f8e8-4293-9d9d-74cbe0657d07"). InnerVolumeSpecName "kube-api-access-2hc4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.216630 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2cb458d8-f8e8-4293-9d9d-74cbe0657d07" (UID: "2cb458d8-f8e8-4293-9d9d-74cbe0657d07"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.310450 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6g8k\" (UniqueName: \"kubernetes.io/projected/4e534b96-e041-4d5e-b6f6-730864961ca5-kube-api-access-r6g8k\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.310520 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.310543 4931 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.310562 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.310581 4931 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e534b96-e041-4d5e-b6f6-730864961ca5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.310598 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.310615 4931 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4e534b96-e041-4d5e-b6f6-730864961ca5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.310632 4931 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.310648 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hc4b\" (UniqueName: \"kubernetes.io/projected/2cb458d8-f8e8-4293-9d9d-74cbe0657d07-kube-api-access-2hc4b\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.533591 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-769c67ccd8-qlhl2"] Dec 01 15:07:03 crc kubenswrapper[4931]: E1201 15:07:03.534543 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb458d8-f8e8-4293-9d9d-74cbe0657d07" containerName="route-controller-manager" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.534569 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb458d8-f8e8-4293-9d9d-74cbe0657d07" containerName="route-controller-manager" Dec 01 15:07:03 crc kubenswrapper[4931]: E1201 15:07:03.534591 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e534b96-e041-4d5e-b6f6-730864961ca5" containerName="controller-manager" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.534603 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e534b96-e041-4d5e-b6f6-730864961ca5" containerName="controller-manager" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.534776 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb458d8-f8e8-4293-9d9d-74cbe0657d07" containerName="route-controller-manager" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.534797 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e534b96-e041-4d5e-b6f6-730864961ca5" containerName="controller-manager" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.535575 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.541319 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q"] Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.542370 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.550793 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-769c67ccd8-qlhl2"] Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.554951 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q"] Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.587709 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" event={"ID":"4e534b96-e041-4d5e-b6f6-730864961ca5","Type":"ContainerDied","Data":"0dcb2eb3d4f2ca9bc1c4c0a3d0f508e2dd61d5bb0e822e3266ede7886460fb8f"} Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.587791 4931 scope.go:117] "RemoveContainer" containerID="716343b7c4b5331c9f90f95aed76d7ba62aaff5cf5e1989ea21ac570bb7eb412" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.587970 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-558d65cfdd-ppv4m" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.591375 4931 generic.go:334] "Generic (PLEG): container finished" podID="2cb458d8-f8e8-4293-9d9d-74cbe0657d07" containerID="22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158" exitCode=0 Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.591437 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" event={"ID":"2cb458d8-f8e8-4293-9d9d-74cbe0657d07","Type":"ContainerDied","Data":"22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158"} Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.591469 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" event={"ID":"2cb458d8-f8e8-4293-9d9d-74cbe0657d07","Type":"ContainerDied","Data":"8295143a31825571235c51dd73790a26bab484096697154276ce259f9fc99f32"} Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.591536 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-986db4786-4lf42" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.619990 4931 scope.go:117] "RemoveContainer" containerID="22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.639692 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-558d65cfdd-ppv4m"] Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.642552 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-558d65cfdd-ppv4m"] Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.643905 4931 scope.go:117] "RemoveContainer" containerID="22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158" Dec 01 15:07:03 crc kubenswrapper[4931]: E1201 15:07:03.648010 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158\": container with ID starting with 22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158 not found: ID does not exist" containerID="22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.648054 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158"} err="failed to get container status \"22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158\": rpc error: code = NotFound desc = could not find container \"22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158\": container with ID starting with 22d30ebe851f86f76ff3c73af29106809a1d08259547aa9c4983acdaa1887158 not found: ID does not exist" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.650754 4931 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-986db4786-4lf42"] Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.653972 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-986db4786-4lf42"] Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.716581 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-config\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.717722 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df85436e-5679-4b8e-a1cf-de53a15ff38a-serving-cert\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.717901 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df85436e-5679-4b8e-a1cf-de53a15ff38a-proxy-ca-bundles\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.717995 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w275t\" (UniqueName: \"kubernetes.io/projected/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-kube-api-access-w275t\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: 
\"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.718115 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df85436e-5679-4b8e-a1cf-de53a15ff38a-config\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.718365 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-client-ca\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.718414 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df85436e-5679-4b8e-a1cf-de53a15ff38a-client-ca\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.718535 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-285qw\" (UniqueName: \"kubernetes.io/projected/df85436e-5679-4b8e-a1cf-de53a15ff38a-kube-api-access-285qw\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.718580 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-serving-cert\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.819932 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-client-ca\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.820000 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df85436e-5679-4b8e-a1cf-de53a15ff38a-client-ca\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.820051 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-285qw\" (UniqueName: \"kubernetes.io/projected/df85436e-5679-4b8e-a1cf-de53a15ff38a-kube-api-access-285qw\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.820078 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-serving-cert\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " 
pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.820109 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-config\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.820138 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df85436e-5679-4b8e-a1cf-de53a15ff38a-serving-cert\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.820170 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df85436e-5679-4b8e-a1cf-de53a15ff38a-proxy-ca-bundles\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.820202 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w275t\" (UniqueName: \"kubernetes.io/projected/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-kube-api-access-w275t\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.820241 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df85436e-5679-4b8e-a1cf-de53a15ff38a-config\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.821698 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df85436e-5679-4b8e-a1cf-de53a15ff38a-client-ca\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.821862 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-config\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.822614 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df85436e-5679-4b8e-a1cf-de53a15ff38a-config\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.823224 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df85436e-5679-4b8e-a1cf-de53a15ff38a-proxy-ca-bundles\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.823914 4931 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-client-ca\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.826079 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df85436e-5679-4b8e-a1cf-de53a15ff38a-serving-cert\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.832196 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-serving-cert\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.847246 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w275t\" (UniqueName: \"kubernetes.io/projected/0e227039-dfcf-4e4f-ba5d-0d578c633e3b-kube-api-access-w275t\") pod \"route-controller-manager-54ccdf98c5-gdb9q\" (UID: \"0e227039-dfcf-4e4f-ba5d-0d578c633e3b\") " pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.858833 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-285qw\" (UniqueName: \"kubernetes.io/projected/df85436e-5679-4b8e-a1cf-de53a15ff38a-kube-api-access-285qw\") pod \"controller-manager-769c67ccd8-qlhl2\" (UID: \"df85436e-5679-4b8e-a1cf-de53a15ff38a\") " 
pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.919492 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:03 crc kubenswrapper[4931]: I1201 15:07:03.933726 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:04 crc kubenswrapper[4931]: I1201 15:07:04.196692 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q"] Dec 01 15:07:04 crc kubenswrapper[4931]: I1201 15:07:04.215707 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-769c67ccd8-qlhl2"] Dec 01 15:07:04 crc kubenswrapper[4931]: W1201 15:07:04.225735 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf85436e_5679_4b8e_a1cf_de53a15ff38a.slice/crio-d02d30ca52e885d9de4ab175aeaf40091f00c762381c9276d613b7c700c893ef WatchSource:0}: Error finding container d02d30ca52e885d9de4ab175aeaf40091f00c762381c9276d613b7c700c893ef: Status 404 returned error can't find the container with id d02d30ca52e885d9de4ab175aeaf40091f00c762381c9276d613b7c700c893ef Dec 01 15:07:04 crc kubenswrapper[4931]: I1201 15:07:04.248630 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb458d8-f8e8-4293-9d9d-74cbe0657d07" path="/var/lib/kubelet/pods/2cb458d8-f8e8-4293-9d9d-74cbe0657d07/volumes" Dec 01 15:07:04 crc kubenswrapper[4931]: I1201 15:07:04.249343 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e534b96-e041-4d5e-b6f6-730864961ca5" path="/var/lib/kubelet/pods/4e534b96-e041-4d5e-b6f6-730864961ca5/volumes" Dec 01 15:07:04 crc kubenswrapper[4931]: I1201 15:07:04.608795 
4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" event={"ID":"0e227039-dfcf-4e4f-ba5d-0d578c633e3b","Type":"ContainerStarted","Data":"6daf14f012f9fadae7789137f660dcbb1bcaa4b173ddc596dc1a2afbd7b5ee1b"} Dec 01 15:07:04 crc kubenswrapper[4931]: I1201 15:07:04.610636 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" event={"ID":"df85436e-5679-4b8e-a1cf-de53a15ff38a","Type":"ContainerStarted","Data":"d02d30ca52e885d9de4ab175aeaf40091f00c762381c9276d613b7c700c893ef"} Dec 01 15:07:05 crc kubenswrapper[4931]: I1201 15:07:05.625768 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" event={"ID":"df85436e-5679-4b8e-a1cf-de53a15ff38a","Type":"ContainerStarted","Data":"338f31d45be4855565a6d13ce1da26036937f45d521c0b3fc070a240e301eff6"} Dec 01 15:07:05 crc kubenswrapper[4931]: I1201 15:07:05.626360 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:05 crc kubenswrapper[4931]: I1201 15:07:05.628690 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" event={"ID":"0e227039-dfcf-4e4f-ba5d-0d578c633e3b","Type":"ContainerStarted","Data":"47a8558536ba7bd8016a04118193a64c0bae8baac8e31e79c7dbe32b2f1ca89b"} Dec 01 15:07:05 crc kubenswrapper[4931]: I1201 15:07:05.628983 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:05 crc kubenswrapper[4931]: I1201 15:07:05.631551 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" Dec 01 15:07:05 crc kubenswrapper[4931]: I1201 
15:07:05.633724 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" Dec 01 15:07:05 crc kubenswrapper[4931]: I1201 15:07:05.651972 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-769c67ccd8-qlhl2" podStartSLOduration=3.651947436 podStartE2EDuration="3.651947436s" podCreationTimestamp="2025-12-01 15:07:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:07:05.64824794 +0000 UTC m=+372.074121627" watchObservedRunningTime="2025-12-01 15:07:05.651947436 +0000 UTC m=+372.077821103" Dec 01 15:07:05 crc kubenswrapper[4931]: I1201 15:07:05.715335 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54ccdf98c5-gdb9q" podStartSLOduration=3.715288734 podStartE2EDuration="3.715288734s" podCreationTimestamp="2025-12-01 15:07:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:07:05.711532576 +0000 UTC m=+372.137406253" watchObservedRunningTime="2025-12-01 15:07:05.715288734 +0000 UTC m=+372.141162401" Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.871881 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfgd5"] Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.873296 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sfgd5" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerName="registry-server" containerID="cri-o://c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b" gracePeriod=30 Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.876458 4931 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p62ld"] Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.876826 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p62ld" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerName="registry-server" containerID="cri-o://f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61" gracePeriod=30 Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.891855 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gvtnh"] Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.892238 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" containerName="marketplace-operator" containerID="cri-o://0cdc2764b445ae2f9bf7c48c706b9c0078cf6bc2aab4ecb64ca04314c92a4e35" gracePeriod=30 Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.908062 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vldw"] Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.908433 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9vldw" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerName="registry-server" containerID="cri-o://94895ea75731a7a0bce9173168e2cdf8f66df1bf6c07719ad4f13132cba35795" gracePeriod=30 Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.939191 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47bmf"] Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.940073 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-47bmf" podUID="8409825a-f886-4896-9c0c-919d12b3761c" 
containerName="registry-server" containerID="cri-o://1465d790ef7c2c06790346fd3cae6f9da1b256cd63beff2695d5b49dd359b18f" gracePeriod=30 Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.950003 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8tfrh"] Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.974111 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8tfrh"] Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.974290 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.988872 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d87b0f89-7ea5-4550-bace-1df5f7c508db-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8tfrh\" (UID: \"d87b0f89-7ea5-4550-bace-1df5f7c508db\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.988942 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d87b0f89-7ea5-4550-bace-1df5f7c508db-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8tfrh\" (UID: \"d87b0f89-7ea5-4550-bace-1df5f7c508db\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:13 crc kubenswrapper[4931]: I1201 15:07:13.988988 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk84j\" (UniqueName: \"kubernetes.io/projected/d87b0f89-7ea5-4550-bace-1df5f7c508db-kube-api-access-xk84j\") pod \"marketplace-operator-79b997595-8tfrh\" (UID: \"d87b0f89-7ea5-4550-bace-1df5f7c508db\") 
" pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.089440 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk84j\" (UniqueName: \"kubernetes.io/projected/d87b0f89-7ea5-4550-bace-1df5f7c508db-kube-api-access-xk84j\") pod \"marketplace-operator-79b997595-8tfrh\" (UID: \"d87b0f89-7ea5-4550-bace-1df5f7c508db\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.089501 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d87b0f89-7ea5-4550-bace-1df5f7c508db-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8tfrh\" (UID: \"d87b0f89-7ea5-4550-bace-1df5f7c508db\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.089542 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d87b0f89-7ea5-4550-bace-1df5f7c508db-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8tfrh\" (UID: \"d87b0f89-7ea5-4550-bace-1df5f7c508db\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.091006 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d87b0f89-7ea5-4550-bace-1df5f7c508db-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8tfrh\" (UID: \"d87b0f89-7ea5-4550-bace-1df5f7c508db\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.106563 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/d87b0f89-7ea5-4550-bace-1df5f7c508db-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8tfrh\" (UID: \"d87b0f89-7ea5-4550-bace-1df5f7c508db\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.112753 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk84j\" (UniqueName: \"kubernetes.io/projected/d87b0f89-7ea5-4550-bace-1df5f7c508db-kube-api-access-xk84j\") pod \"marketplace-operator-79b997595-8tfrh\" (UID: \"d87b0f89-7ea5-4550-bace-1df5f7c508db\") " pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.377197 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.414097 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.599298 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-catalog-content\") pod \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.599374 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-utilities\") pod \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.599532 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27rkm\" (UniqueName: 
\"kubernetes.io/projected/3667d33f-0665-4c2a-bbed-a160c3d48ed9-kube-api-access-27rkm\") pod \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\" (UID: \"3667d33f-0665-4c2a-bbed-a160c3d48ed9\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.602980 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-utilities" (OuterVolumeSpecName: "utilities") pod "3667d33f-0665-4c2a-bbed-a160c3d48ed9" (UID: "3667d33f-0665-4c2a-bbed-a160c3d48ed9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.608946 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3667d33f-0665-4c2a-bbed-a160c3d48ed9-kube-api-access-27rkm" (OuterVolumeSpecName: "kube-api-access-27rkm") pod "3667d33f-0665-4c2a-bbed-a160c3d48ed9" (UID: "3667d33f-0665-4c2a-bbed-a160c3d48ed9"). InnerVolumeSpecName "kube-api-access-27rkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.649526 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3667d33f-0665-4c2a-bbed-a160c3d48ed9" (UID: "3667d33f-0665-4c2a-bbed-a160c3d48ed9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.679329 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.701288 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27rkm\" (UniqueName: \"kubernetes.io/projected/3667d33f-0665-4c2a-bbed-a160c3d48ed9-kube-api-access-27rkm\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.701321 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.701332 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3667d33f-0665-4c2a-bbed-a160c3d48ed9-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.701595 4931 generic.go:334] "Generic (PLEG): container finished" podID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerID="c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b" exitCode=0 Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.701692 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfgd5" event={"ID":"3667d33f-0665-4c2a-bbed-a160c3d48ed9","Type":"ContainerDied","Data":"c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b"} Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.701733 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfgd5" event={"ID":"3667d33f-0665-4c2a-bbed-a160c3d48ed9","Type":"ContainerDied","Data":"883af56b468fa3697ed258b11906263712bf18789f4f0a4602cb0b51dac58a6e"} Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.701754 4931 scope.go:117] "RemoveContainer" containerID="c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b" Dec 01 15:07:14 crc 
kubenswrapper[4931]: I1201 15:07:14.701948 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfgd5" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.708943 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.709033 4931 generic.go:334] "Generic (PLEG): container finished" podID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerID="f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61" exitCode=0 Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.709117 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p62ld" event={"ID":"eeec19ab-af88-4a66-8414-a15046f37aaf","Type":"ContainerDied","Data":"f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61"} Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.709143 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p62ld" event={"ID":"eeec19ab-af88-4a66-8414-a15046f37aaf","Type":"ContainerDied","Data":"bb7119ac85ba7f2340a6faae4363e8bd68ea893d7592bec5f185ffd8ec101bbb"} Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.709192 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p62ld" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.715231 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.722665 4931 generic.go:334] "Generic (PLEG): container finished" podID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerID="94895ea75731a7a0bce9173168e2cdf8f66df1bf6c07719ad4f13132cba35795" exitCode=0 Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.722734 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vldw" event={"ID":"6d290cb6-63ef-49e0-8772-c74447b6fcff","Type":"ContainerDied","Data":"94895ea75731a7a0bce9173168e2cdf8f66df1bf6c07719ad4f13132cba35795"} Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.731955 4931 generic.go:334] "Generic (PLEG): container finished" podID="98a131c9-fc6c-4a27-a774-227258b380c0" containerID="0cdc2764b445ae2f9bf7c48c706b9c0078cf6bc2aab4ecb64ca04314c92a4e35" exitCode=0 Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.732035 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" event={"ID":"98a131c9-fc6c-4a27-a774-227258b380c0","Type":"ContainerDied","Data":"0cdc2764b445ae2f9bf7c48c706b9c0078cf6bc2aab4ecb64ca04314c92a4e35"} Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.732122 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gvtnh" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.733224 4931 scope.go:117] "RemoveContainer" containerID="8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.735807 4931 generic.go:334] "Generic (PLEG): container finished" podID="8409825a-f886-4896-9c0c-919d12b3761c" containerID="1465d790ef7c2c06790346fd3cae6f9da1b256cd63beff2695d5b49dd359b18f" exitCode=0 Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.735866 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47bmf" event={"ID":"8409825a-f886-4896-9c0c-919d12b3761c","Type":"ContainerDied","Data":"1465d790ef7c2c06790346fd3cae6f9da1b256cd63beff2695d5b49dd359b18f"} Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.735916 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47bmf" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.756583 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.776879 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfgd5"] Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.777021 4931 scope.go:117] "RemoveContainer" containerID="64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.781959 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sfgd5"] Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.802073 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-operator-metrics\") pod \"98a131c9-fc6c-4a27-a774-227258b380c0\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.802180 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-trusted-ca\") pod \"98a131c9-fc6c-4a27-a774-227258b380c0\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.802233 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmc5g\" (UniqueName: \"kubernetes.io/projected/eeec19ab-af88-4a66-8414-a15046f37aaf-kube-api-access-wmc5g\") pod \"eeec19ab-af88-4a66-8414-a15046f37aaf\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.802300 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r66vn\" (UniqueName: \"kubernetes.io/projected/98a131c9-fc6c-4a27-a774-227258b380c0-kube-api-access-r66vn\") pod 
\"98a131c9-fc6c-4a27-a774-227258b380c0\" (UID: \"98a131c9-fc6c-4a27-a774-227258b380c0\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.802348 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-utilities\") pod \"eeec19ab-af88-4a66-8414-a15046f37aaf\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.802378 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-catalog-content\") pod \"eeec19ab-af88-4a66-8414-a15046f37aaf\" (UID: \"eeec19ab-af88-4a66-8414-a15046f37aaf\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.803958 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-utilities" (OuterVolumeSpecName: "utilities") pod "eeec19ab-af88-4a66-8414-a15046f37aaf" (UID: "eeec19ab-af88-4a66-8414-a15046f37aaf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.804566 4931 scope.go:117] "RemoveContainer" containerID="c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b" Dec 01 15:07:14 crc kubenswrapper[4931]: E1201 15:07:14.805183 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b\": container with ID starting with c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b not found: ID does not exist" containerID="c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.805214 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "98a131c9-fc6c-4a27-a774-227258b380c0" (UID: "98a131c9-fc6c-4a27-a774-227258b380c0"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.805253 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b"} err="failed to get container status \"c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b\": rpc error: code = NotFound desc = could not find container \"c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b\": container with ID starting with c29ec789d001949820e8c5fbc81e726543466410fd2d7bb467285ebdda30965b not found: ID does not exist" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.805287 4931 scope.go:117] "RemoveContainer" containerID="8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.806969 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeec19ab-af88-4a66-8414-a15046f37aaf-kube-api-access-wmc5g" (OuterVolumeSpecName: "kube-api-access-wmc5g") pod "eeec19ab-af88-4a66-8414-a15046f37aaf" (UID: "eeec19ab-af88-4a66-8414-a15046f37aaf"). InnerVolumeSpecName "kube-api-access-wmc5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.807339 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "98a131c9-fc6c-4a27-a774-227258b380c0" (UID: "98a131c9-fc6c-4a27-a774-227258b380c0"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.808359 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98a131c9-fc6c-4a27-a774-227258b380c0-kube-api-access-r66vn" (OuterVolumeSpecName: "kube-api-access-r66vn") pod "98a131c9-fc6c-4a27-a774-227258b380c0" (UID: "98a131c9-fc6c-4a27-a774-227258b380c0"). InnerVolumeSpecName "kube-api-access-r66vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: E1201 15:07:14.808644 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0\": container with ID starting with 8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0 not found: ID does not exist" containerID="8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.808698 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0"} err="failed to get container status \"8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0\": rpc error: code = NotFound desc = could not find container \"8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0\": container with ID starting with 8bd48e8c46d602df9a36f0cfc421daf5a44c41924cbe4452647150fe95e01be0 not found: ID does not exist" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.808732 4931 scope.go:117] "RemoveContainer" containerID="64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb" Dec 01 15:07:14 crc kubenswrapper[4931]: E1201 15:07:14.809180 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb\": container with ID starting with 64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb not found: ID does not exist" containerID="64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.809206 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb"} err="failed to get container status \"64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb\": rpc error: code = NotFound desc = could not find container \"64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb\": container with ID starting with 64636f5b5bf5a856b2d263cb60a88f72767f859e98e4b4eaebc3f1d42e2abbeb not found: ID does not exist" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.809220 4931 scope.go:117] "RemoveContainer" containerID="f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.823198 4931 scope.go:117] "RemoveContainer" containerID="57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.839203 4931 scope.go:117] "RemoveContainer" containerID="e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.852070 4931 scope.go:117] "RemoveContainer" containerID="f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61" Dec 01 15:07:14 crc kubenswrapper[4931]: E1201 15:07:14.852768 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61\": container with ID starting with f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61 not found: ID does not exist" 
containerID="f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.852799 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61"} err="failed to get container status \"f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61\": rpc error: code = NotFound desc = could not find container \"f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61\": container with ID starting with f93588e600fd06bcb71f02640b4ed8fb923c8911a84f74cb1ef3f71b5e756b61 not found: ID does not exist" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.852828 4931 scope.go:117] "RemoveContainer" containerID="57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0" Dec 01 15:07:14 crc kubenswrapper[4931]: E1201 15:07:14.853173 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0\": container with ID starting with 57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0 not found: ID does not exist" containerID="57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.853216 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0"} err="failed to get container status \"57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0\": rpc error: code = NotFound desc = could not find container \"57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0\": container with ID starting with 57f6ada2f3f43414e74b48a29a9c83fecc09b549e6a051e69f564eea9d99baa0 not found: ID does not exist" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.853248 4931 scope.go:117] 
"RemoveContainer" containerID="e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2" Dec 01 15:07:14 crc kubenswrapper[4931]: E1201 15:07:14.853594 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2\": container with ID starting with e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2 not found: ID does not exist" containerID="e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.853650 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2"} err="failed to get container status \"e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2\": rpc error: code = NotFound desc = could not find container \"e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2\": container with ID starting with e7e4ed2ecd850efb4c288fc7fb3e313509224d677f5f0cc807f59b4ad0edb5f2 not found: ID does not exist" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.853681 4931 scope.go:117] "RemoveContainer" containerID="0cdc2764b445ae2f9bf7c48c706b9c0078cf6bc2aab4ecb64ca04314c92a4e35" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.854739 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eeec19ab-af88-4a66-8414-a15046f37aaf" (UID: "eeec19ab-af88-4a66-8414-a15046f37aaf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.867677 4931 scope.go:117] "RemoveContainer" containerID="9dc35aace1b82369a302f408af3779655c31571b76bee8ff330872097cb54890" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.883620 4931 scope.go:117] "RemoveContainer" containerID="1465d790ef7c2c06790346fd3cae6f9da1b256cd63beff2695d5b49dd359b18f" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.903828 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-utilities\") pod \"8409825a-f886-4896-9c0c-919d12b3761c\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.903983 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-utilities\") pod \"6d290cb6-63ef-49e0-8772-c74447b6fcff\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.904045 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-catalog-content\") pod \"6d290cb6-63ef-49e0-8772-c74447b6fcff\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.904094 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-catalog-content\") pod \"8409825a-f886-4896-9c0c-919d12b3761c\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.904170 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vch6b\" (UniqueName: 
\"kubernetes.io/projected/8409825a-f886-4896-9c0c-919d12b3761c-kube-api-access-vch6b\") pod \"8409825a-f886-4896-9c0c-919d12b3761c\" (UID: \"8409825a-f886-4896-9c0c-919d12b3761c\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.904197 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh22s\" (UniqueName: \"kubernetes.io/projected/6d290cb6-63ef-49e0-8772-c74447b6fcff-kube-api-access-nh22s\") pod \"6d290cb6-63ef-49e0-8772-c74447b6fcff\" (UID: \"6d290cb6-63ef-49e0-8772-c74447b6fcff\") " Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.904473 4931 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.904493 4931 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98a131c9-fc6c-4a27-a774-227258b380c0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.904505 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmc5g\" (UniqueName: \"kubernetes.io/projected/eeec19ab-af88-4a66-8414-a15046f37aaf-kube-api-access-wmc5g\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.904518 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r66vn\" (UniqueName: \"kubernetes.io/projected/98a131c9-fc6c-4a27-a774-227258b380c0-kube-api-access-r66vn\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.904530 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 
15:07:14.904541 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeec19ab-af88-4a66-8414-a15046f37aaf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.906478 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-utilities" (OuterVolumeSpecName: "utilities") pod "8409825a-f886-4896-9c0c-919d12b3761c" (UID: "8409825a-f886-4896-9c0c-919d12b3761c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.906969 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-utilities" (OuterVolumeSpecName: "utilities") pod "6d290cb6-63ef-49e0-8772-c74447b6fcff" (UID: "6d290cb6-63ef-49e0-8772-c74447b6fcff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.907909 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d290cb6-63ef-49e0-8772-c74447b6fcff-kube-api-access-nh22s" (OuterVolumeSpecName: "kube-api-access-nh22s") pod "6d290cb6-63ef-49e0-8772-c74447b6fcff" (UID: "6d290cb6-63ef-49e0-8772-c74447b6fcff"). InnerVolumeSpecName "kube-api-access-nh22s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.930687 4931 scope.go:117] "RemoveContainer" containerID="433449a5079cd5eabe25db8c690cccdac271038428004b87d57deaa093955913" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.930939 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8409825a-f886-4896-9c0c-919d12b3761c-kube-api-access-vch6b" (OuterVolumeSpecName: "kube-api-access-vch6b") pod "8409825a-f886-4896-9c0c-919d12b3761c" (UID: "8409825a-f886-4896-9c0c-919d12b3761c"). InnerVolumeSpecName "kube-api-access-vch6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.942825 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8tfrh"] Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.944229 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d290cb6-63ef-49e0-8772-c74447b6fcff" (UID: "6d290cb6-63ef-49e0-8772-c74447b6fcff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:07:14 crc kubenswrapper[4931]: I1201 15:07:14.968637 4931 scope.go:117] "RemoveContainer" containerID="e1ba5a4f789fad0996cecf5c6b77e655605e7f3e32b9f904253bf23c7ac011cb" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.005487 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.005524 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vch6b\" (UniqueName: \"kubernetes.io/projected/8409825a-f886-4896-9c0c-919d12b3761c-kube-api-access-vch6b\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.005541 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh22s\" (UniqueName: \"kubernetes.io/projected/6d290cb6-63ef-49e0-8772-c74447b6fcff-kube-api-access-nh22s\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.005553 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.005567 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d290cb6-63ef-49e0-8772-c74447b6fcff-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.035353 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8409825a-f886-4896-9c0c-919d12b3761c" (UID: "8409825a-f886-4896-9c0c-919d12b3761c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.082273 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p62ld"] Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.086350 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p62ld"] Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.096102 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47bmf"] Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.100051 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-47bmf"] Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.107952 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409825a-f886-4896-9c0c-919d12b3761c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.111582 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gvtnh"] Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.119109 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gvtnh"] Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.746767 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" event={"ID":"d87b0f89-7ea5-4550-bace-1df5f7c508db","Type":"ContainerStarted","Data":"0726874ce0ad7998fa81365e301eb67779f6ba1a7a4d232d1e5627b38313b586"} Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.746826 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" 
event={"ID":"d87b0f89-7ea5-4550-bace-1df5f7c508db","Type":"ContainerStarted","Data":"c18e2ee11f25baec868959cf86adc82cdedf2120686e90be25572f4c0912b17d"} Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.750001 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vldw" event={"ID":"6d290cb6-63ef-49e0-8772-c74447b6fcff","Type":"ContainerDied","Data":"db853f56b468c51526b7c819de767a307a4981de89b4b80715f49f44039979b2"} Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.750062 4931 scope.go:117] "RemoveContainer" containerID="94895ea75731a7a0bce9173168e2cdf8f66df1bf6c07719ad4f13132cba35795" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.750124 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vldw" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.781586 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" podStartSLOduration=2.781557925 podStartE2EDuration="2.781557925s" podCreationTimestamp="2025-12-01 15:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:07:15.776971773 +0000 UTC m=+382.202845460" watchObservedRunningTime="2025-12-01 15:07:15.781557925 +0000 UTC m=+382.207431602" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.781831 4931 scope.go:117] "RemoveContainer" containerID="514501bdcfd5b8e8ec60f6c82e231d32a0c42738849e2284e7e45cc580d2fcc3" Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.809035 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vldw"] Dec 01 15:07:15 crc kubenswrapper[4931]: I1201 15:07:15.820582 4931 scope.go:117] "RemoveContainer" containerID="b27d9c4336a03ae4746269cb17457b2de493d78317bb797fb0e9dde1ee4e3d90" Dec 01 15:07:15 crc 
kubenswrapper[4931]: I1201 15:07:15.826642 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vldw"] Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.086923 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h8422"] Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087795 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerName="extract-utilities" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087812 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerName="extract-utilities" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087826 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8409825a-f886-4896-9c0c-919d12b3761c" containerName="extract-content" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087834 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8409825a-f886-4896-9c0c-919d12b3761c" containerName="extract-content" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087842 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerName="extract-content" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087854 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerName="extract-content" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087867 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8409825a-f886-4896-9c0c-919d12b3761c" containerName="extract-utilities" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087874 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8409825a-f886-4896-9c0c-919d12b3761c" containerName="extract-utilities" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087884 4931 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerName="extract-utilities" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087890 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerName="extract-utilities" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087902 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8409825a-f886-4896-9c0c-919d12b3761c" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087909 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8409825a-f886-4896-9c0c-919d12b3761c" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087922 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" containerName="marketplace-operator" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087929 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" containerName="marketplace-operator" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087939 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerName="extract-content" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087946 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerName="extract-content" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087957 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087964 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087972 4931 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.087979 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.087993 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" containerName="marketplace-operator" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088000 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" containerName="marketplace-operator" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.088012 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088019 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.088034 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerName="extract-content" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088040 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerName="extract-content" Dec 01 15:07:16 crc kubenswrapper[4931]: E1201 15:07:16.088053 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerName="extract-utilities" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088060 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerName="extract-utilities" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088163 4931 
memory_manager.go:354] "RemoveStaleState removing state" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088177 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" containerName="marketplace-operator" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088189 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="8409825a-f886-4896-9c0c-919d12b3761c" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088199 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088205 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" containerName="registry-server" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.088213 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" containerName="marketplace-operator" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.089081 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.093030 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.104505 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8422"] Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.229986 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k99gn\" (UniqueName: \"kubernetes.io/projected/eda61d02-1a5b-4602-935e-04ded0ccddb5-kube-api-access-k99gn\") pod \"certified-operators-h8422\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.230047 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-utilities\") pod \"certified-operators-h8422\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.230091 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-catalog-content\") pod \"certified-operators-h8422\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.250051 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3667d33f-0665-4c2a-bbed-a160c3d48ed9" path="/var/lib/kubelet/pods/3667d33f-0665-4c2a-bbed-a160c3d48ed9/volumes" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 
15:07:16.250882 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d290cb6-63ef-49e0-8772-c74447b6fcff" path="/var/lib/kubelet/pods/6d290cb6-63ef-49e0-8772-c74447b6fcff/volumes" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.252011 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8409825a-f886-4896-9c0c-919d12b3761c" path="/var/lib/kubelet/pods/8409825a-f886-4896-9c0c-919d12b3761c/volumes" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.253774 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98a131c9-fc6c-4a27-a774-227258b380c0" path="/var/lib/kubelet/pods/98a131c9-fc6c-4a27-a774-227258b380c0/volumes" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.254622 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeec19ab-af88-4a66-8414-a15046f37aaf" path="/var/lib/kubelet/pods/eeec19ab-af88-4a66-8414-a15046f37aaf/volumes" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.289454 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k5l6c"] Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.292589 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.296654 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.300930 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5l6c"] Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.331365 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k99gn\" (UniqueName: \"kubernetes.io/projected/eda61d02-1a5b-4602-935e-04ded0ccddb5-kube-api-access-k99gn\") pod \"certified-operators-h8422\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.331452 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-utilities\") pod \"certified-operators-h8422\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.331493 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-catalog-content\") pod \"certified-operators-h8422\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.332028 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-utilities\") pod \"certified-operators-h8422\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " 
pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.332284 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-catalog-content\") pod \"certified-operators-h8422\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.357557 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k99gn\" (UniqueName: \"kubernetes.io/projected/eda61d02-1a5b-4602-935e-04ded0ccddb5-kube-api-access-k99gn\") pod \"certified-operators-h8422\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.420971 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.432456 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60c5b6d-f16e-4b68-a761-76678ada1930-catalog-content\") pod \"redhat-marketplace-k5l6c\" (UID: \"b60c5b6d-f16e-4b68-a761-76678ada1930\") " pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.432503 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60c5b6d-f16e-4b68-a761-76678ada1930-utilities\") pod \"redhat-marketplace-k5l6c\" (UID: \"b60c5b6d-f16e-4b68-a761-76678ada1930\") " pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.432538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-x8fsx\" (UniqueName: \"kubernetes.io/projected/b60c5b6d-f16e-4b68-a761-76678ada1930-kube-api-access-x8fsx\") pod \"redhat-marketplace-k5l6c\" (UID: \"b60c5b6d-f16e-4b68-a761-76678ada1930\") " pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.534470 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60c5b6d-f16e-4b68-a761-76678ada1930-utilities\") pod \"redhat-marketplace-k5l6c\" (UID: \"b60c5b6d-f16e-4b68-a761-76678ada1930\") " pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.534722 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8fsx\" (UniqueName: \"kubernetes.io/projected/b60c5b6d-f16e-4b68-a761-76678ada1930-kube-api-access-x8fsx\") pod \"redhat-marketplace-k5l6c\" (UID: \"b60c5b6d-f16e-4b68-a761-76678ada1930\") " pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.534773 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60c5b6d-f16e-4b68-a761-76678ada1930-catalog-content\") pod \"redhat-marketplace-k5l6c\" (UID: \"b60c5b6d-f16e-4b68-a761-76678ada1930\") " pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.535209 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60c5b6d-f16e-4b68-a761-76678ada1930-catalog-content\") pod \"redhat-marketplace-k5l6c\" (UID: \"b60c5b6d-f16e-4b68-a761-76678ada1930\") " pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.535459 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60c5b6d-f16e-4b68-a761-76678ada1930-utilities\") pod \"redhat-marketplace-k5l6c\" (UID: \"b60c5b6d-f16e-4b68-a761-76678ada1930\") " pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.562744 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8fsx\" (UniqueName: \"kubernetes.io/projected/b60c5b6d-f16e-4b68-a761-76678ada1930-kube-api-access-x8fsx\") pod \"redhat-marketplace-k5l6c\" (UID: \"b60c5b6d-f16e-4b68-a761-76678ada1930\") " pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.618304 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5l6c" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.773867 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.778953 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8tfrh" Dec 01 15:07:16 crc kubenswrapper[4931]: I1201 15:07:16.880793 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8422"] Dec 01 15:07:17 crc kubenswrapper[4931]: I1201 15:07:17.032725 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5l6c"] Dec 01 15:07:17 crc kubenswrapper[4931]: W1201 15:07:17.045844 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb60c5b6d_f16e_4b68_a761_76678ada1930.slice/crio-2acd9994f7c96873b389815015a79c86dab43d4691cc7c243dd4b9e82a9f464b WatchSource:0}: Error finding container 2acd9994f7c96873b389815015a79c86dab43d4691cc7c243dd4b9e82a9f464b: Status 404 
returned error can't find the container with id 2acd9994f7c96873b389815015a79c86dab43d4691cc7c243dd4b9e82a9f464b Dec 01 15:07:17 crc kubenswrapper[4931]: I1201 15:07:17.782260 4931 generic.go:334] "Generic (PLEG): container finished" podID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerID="f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e" exitCode=0 Dec 01 15:07:17 crc kubenswrapper[4931]: I1201 15:07:17.782473 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8422" event={"ID":"eda61d02-1a5b-4602-935e-04ded0ccddb5","Type":"ContainerDied","Data":"f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e"} Dec 01 15:07:17 crc kubenswrapper[4931]: I1201 15:07:17.782652 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8422" event={"ID":"eda61d02-1a5b-4602-935e-04ded0ccddb5","Type":"ContainerStarted","Data":"c1792ea1df3caa0e9cc1068ab3dbacecffc55391dc02ca0b8b602c930aa78cb8"} Dec 01 15:07:17 crc kubenswrapper[4931]: I1201 15:07:17.785488 4931 generic.go:334] "Generic (PLEG): container finished" podID="b60c5b6d-f16e-4b68-a761-76678ada1930" containerID="f53a38d27556c283e324fd0bf7b5961a537dee37ba802762705e523bd0e82b25" exitCode=0 Dec 01 15:07:17 crc kubenswrapper[4931]: I1201 15:07:17.787110 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5l6c" event={"ID":"b60c5b6d-f16e-4b68-a761-76678ada1930","Type":"ContainerDied","Data":"f53a38d27556c283e324fd0bf7b5961a537dee37ba802762705e523bd0e82b25"} Dec 01 15:07:17 crc kubenswrapper[4931]: I1201 15:07:17.787147 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5l6c" event={"ID":"b60c5b6d-f16e-4b68-a761-76678ada1930","Type":"ContainerStarted","Data":"2acd9994f7c96873b389815015a79c86dab43d4691cc7c243dd4b9e82a9f464b"} Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.485738 4931 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rpwj9"] Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.489803 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.493334 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.498942 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rpwj9"] Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.668446 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83ac02de-c63c-4b14-bcbe-28e0ef9d91f9-catalog-content\") pod \"community-operators-rpwj9\" (UID: \"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9\") " pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.668525 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83ac02de-c63c-4b14-bcbe-28e0ef9d91f9-utilities\") pod \"community-operators-rpwj9\" (UID: \"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9\") " pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.668576 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd42g\" (UniqueName: \"kubernetes.io/projected/83ac02de-c63c-4b14-bcbe-28e0ef9d91f9-kube-api-access-gd42g\") pod \"community-operators-rpwj9\" (UID: \"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9\") " pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.680837 4931 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-wxd5j"] Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.681911 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.686571 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.704037 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxd5j"] Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.770477 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-utilities\") pod \"redhat-operators-wxd5j\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.770533 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83ac02de-c63c-4b14-bcbe-28e0ef9d91f9-catalog-content\") pod \"community-operators-rpwj9\" (UID: \"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9\") " pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.770567 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83ac02de-c63c-4b14-bcbe-28e0ef9d91f9-utilities\") pod \"community-operators-rpwj9\" (UID: \"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9\") " pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.770585 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-catalog-content\") pod \"redhat-operators-wxd5j\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.770612 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvnzq\" (UniqueName: \"kubernetes.io/projected/c3575355-a4fc-4f78-8512-79f2fb4bd449-kube-api-access-gvnzq\") pod \"redhat-operators-wxd5j\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.770642 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd42g\" (UniqueName: \"kubernetes.io/projected/83ac02de-c63c-4b14-bcbe-28e0ef9d91f9-kube-api-access-gd42g\") pod \"community-operators-rpwj9\" (UID: \"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9\") " pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.771342 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83ac02de-c63c-4b14-bcbe-28e0ef9d91f9-catalog-content\") pod \"community-operators-rpwj9\" (UID: \"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9\") " pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.771592 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83ac02de-c63c-4b14-bcbe-28e0ef9d91f9-utilities\") pod \"community-operators-rpwj9\" (UID: \"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9\") " pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.807280 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd42g\" (UniqueName: 
\"kubernetes.io/projected/83ac02de-c63c-4b14-bcbe-28e0ef9d91f9-kube-api-access-gd42g\") pod \"community-operators-rpwj9\" (UID: \"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9\") " pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.871868 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rpwj9" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.872615 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-utilities\") pod \"redhat-operators-wxd5j\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.872725 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-catalog-content\") pod \"redhat-operators-wxd5j\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.872782 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvnzq\" (UniqueName: \"kubernetes.io/projected/c3575355-a4fc-4f78-8512-79f2fb4bd449-kube-api-access-gvnzq\") pod \"redhat-operators-wxd5j\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.874089 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-utilities\") pod \"redhat-operators-wxd5j\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:18 crc 
kubenswrapper[4931]: I1201 15:07:18.874988 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-catalog-content\") pod \"redhat-operators-wxd5j\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:18 crc kubenswrapper[4931]: I1201 15:07:18.894366 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvnzq\" (UniqueName: \"kubernetes.io/projected/c3575355-a4fc-4f78-8512-79f2fb4bd449-kube-api-access-gvnzq\") pod \"redhat-operators-wxd5j\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.006907 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.318836 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rpwj9"] Dec 01 15:07:19 crc kubenswrapper[4931]: W1201 15:07:19.328116 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83ac02de_c63c_4b14_bcbe_28e0ef9d91f9.slice/crio-979fd957e0a7a4a8ce9fbe514e9b35ecedc746805346c8d914ec79ecda6b1465 WatchSource:0}: Error finding container 979fd957e0a7a4a8ce9fbe514e9b35ecedc746805346c8d914ec79ecda6b1465: Status 404 returned error can't find the container with id 979fd957e0a7a4a8ce9fbe514e9b35ecedc746805346c8d914ec79ecda6b1465 Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.790791 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxd5j"] Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.795626 4931 generic.go:334] "Generic (PLEG): container finished" podID="83ac02de-c63c-4b14-bcbe-28e0ef9d91f9" 
containerID="14067c7079198c2c35d785824782b13f957f3aeb502a2c1b3da27eea99416a21" exitCode=0 Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.795690 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpwj9" event={"ID":"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9","Type":"ContainerDied","Data":"14067c7079198c2c35d785824782b13f957f3aeb502a2c1b3da27eea99416a21"} Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.795723 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpwj9" event={"ID":"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9","Type":"ContainerStarted","Data":"979fd957e0a7a4a8ce9fbe514e9b35ecedc746805346c8d914ec79ecda6b1465"} Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.797547 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8422" event={"ID":"eda61d02-1a5b-4602-935e-04ded0ccddb5","Type":"ContainerStarted","Data":"4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5"} Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.801552 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bxzms"] Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.802685 4931 generic.go:334] "Generic (PLEG): container finished" podID="b60c5b6d-f16e-4b68-a761-76678ada1930" containerID="65438ce0cd5008a72ebb9f02dfabbaa44767c6897985761af8801c5a6b05c549" exitCode=0 Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.803976 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5l6c" event={"ID":"b60c5b6d-f16e-4b68-a761-76678ada1930","Type":"ContainerDied","Data":"65438ce0cd5008a72ebb9f02dfabbaa44767c6897985761af8801c5a6b05c549"} Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.804069 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.827290 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bxzms"] Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.872177 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.872602 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.889292 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmw5\" (UniqueName: \"kubernetes.io/projected/42ffe764-4b88-44c5-884b-bec879c16864-kube-api-access-hkmw5\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.889329 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42ffe764-4b88-44c5-884b-bec879c16864-bound-sa-token\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.889364 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42ffe764-4b88-44c5-884b-bec879c16864-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.889418 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42ffe764-4b88-44c5-884b-bec879c16864-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.889725 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.889904 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42ffe764-4b88-44c5-884b-bec879c16864-registry-tls\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.890082 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42ffe764-4b88-44c5-884b-bec879c16864-registry-certificates\") pod \"image-registry-66df7c8f76-bxzms\" 
(UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.890168 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42ffe764-4b88-44c5-884b-bec879c16864-trusted-ca\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.911267 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.991826 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42ffe764-4b88-44c5-884b-bec879c16864-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.991887 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42ffe764-4b88-44c5-884b-bec879c16864-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.991936 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/42ffe764-4b88-44c5-884b-bec879c16864-registry-tls\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.991969 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42ffe764-4b88-44c5-884b-bec879c16864-registry-certificates\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.991991 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42ffe764-4b88-44c5-884b-bec879c16864-trusted-ca\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.992045 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmw5\" (UniqueName: \"kubernetes.io/projected/42ffe764-4b88-44c5-884b-bec879c16864-kube-api-access-hkmw5\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.992068 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42ffe764-4b88-44c5-884b-bec879c16864-bound-sa-token\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.993277 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/42ffe764-4b88-44c5-884b-bec879c16864-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.994812 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/42ffe764-4b88-44c5-884b-bec879c16864-registry-certificates\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:19 crc kubenswrapper[4931]: I1201 15:07:19.995528 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42ffe764-4b88-44c5-884b-bec879c16864-trusted-ca\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.001338 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/42ffe764-4b88-44c5-884b-bec879c16864-registry-tls\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.002888 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/42ffe764-4b88-44c5-884b-bec879c16864-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 
15:07:20.010487 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42ffe764-4b88-44c5-884b-bec879c16864-bound-sa-token\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms"
Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.012154 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmw5\" (UniqueName: \"kubernetes.io/projected/42ffe764-4b88-44c5-884b-bec879c16864-kube-api-access-hkmw5\") pod \"image-registry-66df7c8f76-bxzms\" (UID: \"42ffe764-4b88-44c5-884b-bec879c16864\") " pod="openshift-image-registry/image-registry-66df7c8f76-bxzms"
Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.241529 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bxzms"
Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.710627 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bxzms"]
Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.818914 4931 generic.go:334] "Generic (PLEG): container finished" podID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerID="4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5" exitCode=0
Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.818977 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8422" event={"ID":"eda61d02-1a5b-4602-935e-04ded0ccddb5","Type":"ContainerDied","Data":"4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5"}
Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.820961 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" event={"ID":"42ffe764-4b88-44c5-884b-bec879c16864","Type":"ContainerStarted","Data":"6b665806206a4d0d8597ae9762e51216d5fe7852d4657c47e2f30df944099c66"}
Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.824993 4931 generic.go:334] "Generic (PLEG): container finished" podID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerID="0cc5d3e1b3fe6a36d6cc91136c9199460adbdf0e334616c49be5e40b75487e2d" exitCode=0
Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.825246 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxd5j" event={"ID":"c3575355-a4fc-4f78-8512-79f2fb4bd449","Type":"ContainerDied","Data":"0cc5d3e1b3fe6a36d6cc91136c9199460adbdf0e334616c49be5e40b75487e2d"}
Dec 01 15:07:20 crc kubenswrapper[4931]: I1201 15:07:20.825308 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxd5j" event={"ID":"c3575355-a4fc-4f78-8512-79f2fb4bd449","Type":"ContainerStarted","Data":"b2914746783150d05d30d52fca96e1b6fb6a80f41200c1e051037cc1e87c17af"}
Dec 01 15:07:21 crc kubenswrapper[4931]: I1201 15:07:21.837712 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5l6c" event={"ID":"b60c5b6d-f16e-4b68-a761-76678ada1930","Type":"ContainerStarted","Data":"fe81f5c6a579ef460753f5daea043a21b51a983c5a1871621eb6ca1cc252e68a"}
Dec 01 15:07:21 crc kubenswrapper[4931]: I1201 15:07:21.840069 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" event={"ID":"42ffe764-4b88-44c5-884b-bec879c16864","Type":"ContainerStarted","Data":"8605b2c47b06e50f84a525d07599bc02bcbc4811a585d93945d888cf06308021"}
Dec 01 15:07:21 crc kubenswrapper[4931]: I1201 15:07:21.840498 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-bxzms"
Dec 01 15:07:21 crc kubenswrapper[4931]: I1201 15:07:21.881928 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k5l6c" podStartSLOduration=2.800229725 podStartE2EDuration="5.881904863s" podCreationTimestamp="2025-12-01 15:07:16 +0000 UTC" firstStartedPulling="2025-12-01 15:07:17.788105302 +0000 UTC m=+384.213978969" lastFinishedPulling="2025-12-01 15:07:20.86978041 +0000 UTC m=+387.295654107" observedRunningTime="2025-12-01 15:07:21.859779419 +0000 UTC m=+388.285653096" watchObservedRunningTime="2025-12-01 15:07:21.881904863 +0000 UTC m=+388.307778530"
Dec 01 15:07:22 crc kubenswrapper[4931]: I1201 15:07:22.853321 4931 generic.go:334] "Generic (PLEG): container finished" podID="83ac02de-c63c-4b14-bcbe-28e0ef9d91f9" containerID="53b78e749b7ecc6131d8252cf7c830f1f132b005bd6b88686d382abccd6885cb" exitCode=0
Dec 01 15:07:22 crc kubenswrapper[4931]: I1201 15:07:22.853436 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpwj9" event={"ID":"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9","Type":"ContainerDied","Data":"53b78e749b7ecc6131d8252cf7c830f1f132b005bd6b88686d382abccd6885cb"}
Dec 01 15:07:22 crc kubenswrapper[4931]: I1201 15:07:22.858051 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8422" event={"ID":"eda61d02-1a5b-4602-935e-04ded0ccddb5","Type":"ContainerStarted","Data":"a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d"}
Dec 01 15:07:22 crc kubenswrapper[4931]: I1201 15:07:22.861087 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxd5j" event={"ID":"c3575355-a4fc-4f78-8512-79f2fb4bd449","Type":"ContainerStarted","Data":"206cb841cfa096041275dfc232e2ab7f03df385fe677bbf0875a53a0ba4208a4"}
Dec 01 15:07:22 crc kubenswrapper[4931]: I1201 15:07:22.897373 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-bxzms" podStartSLOduration=3.8973510620000003 podStartE2EDuration="3.897351062s" podCreationTimestamp="2025-12-01 15:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:07:21.885112905 +0000 UTC m=+388.310986572" watchObservedRunningTime="2025-12-01 15:07:22.897351062 +0000 UTC m=+389.323224729"
Dec 01 15:07:22 crc kubenswrapper[4931]: I1201 15:07:22.915290 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h8422" podStartSLOduration=2.874567618 podStartE2EDuration="6.915266456s" podCreationTimestamp="2025-12-01 15:07:16 +0000 UTC" firstStartedPulling="2025-12-01 15:07:17.78735669 +0000 UTC m=+384.213230357" lastFinishedPulling="2025-12-01 15:07:21.828055528 +0000 UTC m=+388.253929195" observedRunningTime="2025-12-01 15:07:22.911292292 +0000 UTC m=+389.337165959" watchObservedRunningTime="2025-12-01 15:07:22.915266456 +0000 UTC m=+389.341140123"
Dec 01 15:07:23 crc kubenswrapper[4931]: I1201 15:07:23.868920 4931 generic.go:334] "Generic (PLEG): container finished" podID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerID="206cb841cfa096041275dfc232e2ab7f03df385fe677bbf0875a53a0ba4208a4" exitCode=0
Dec 01 15:07:23 crc kubenswrapper[4931]: I1201 15:07:23.869097 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxd5j" event={"ID":"c3575355-a4fc-4f78-8512-79f2fb4bd449","Type":"ContainerDied","Data":"206cb841cfa096041275dfc232e2ab7f03df385fe677bbf0875a53a0ba4208a4"}
Dec 01 15:07:25 crc kubenswrapper[4931]: I1201 15:07:25.885371 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rpwj9" event={"ID":"83ac02de-c63c-4b14-bcbe-28e0ef9d91f9","Type":"ContainerStarted","Data":"7ecb3fb8a4fbd533f2708938aaf8a7b28789966026222caed2d4ee275b8233ef"}
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.422077 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h8422"
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.423695 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h8422"
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.467365 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h8422"
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.492513 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rpwj9" podStartSLOduration=5.896380161 podStartE2EDuration="8.492491317s" podCreationTimestamp="2025-12-01 15:07:18 +0000 UTC" firstStartedPulling="2025-12-01 15:07:20.841959722 +0000 UTC m=+387.267833389" lastFinishedPulling="2025-12-01 15:07:23.438070878 +0000 UTC m=+389.863944545" observedRunningTime="2025-12-01 15:07:25.906480002 +0000 UTC m=+392.332353669" watchObservedRunningTime="2025-12-01 15:07:26.492491317 +0000 UTC m=+392.918364984"
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.619697 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k5l6c"
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.620343 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k5l6c"
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.661913 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k5l6c"
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.895374 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxd5j" event={"ID":"c3575355-a4fc-4f78-8512-79f2fb4bd449","Type":"ContainerStarted","Data":"80346b57b3fb2801c5687bacc00f966c74cecf2ac83c8010e84b5b772a016a62"}
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.922452 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wxd5j" podStartSLOduration=3.553592074 podStartE2EDuration="8.922412047s" podCreationTimestamp="2025-12-01 15:07:18 +0000 UTC" firstStartedPulling="2025-12-01 15:07:20.843760924 +0000 UTC m=+387.269634631" lastFinishedPulling="2025-12-01 15:07:26.212580937 +0000 UTC m=+392.638454604" observedRunningTime="2025-12-01 15:07:26.918444404 +0000 UTC m=+393.344318131" watchObservedRunningTime="2025-12-01 15:07:26.922412047 +0000 UTC m=+393.348285724"
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.938895 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k5l6c"
Dec 01 15:07:26 crc kubenswrapper[4931]: I1201 15:07:26.953550 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h8422"
Dec 01 15:07:28 crc kubenswrapper[4931]: I1201 15:07:28.872311 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rpwj9"
Dec 01 15:07:28 crc kubenswrapper[4931]: I1201 15:07:28.872829 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rpwj9"
Dec 01 15:07:28 crc kubenswrapper[4931]: I1201 15:07:28.939823 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rpwj9"
Dec 01 15:07:29 crc kubenswrapper[4931]: I1201 15:07:29.007322 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wxd5j"
Dec 01 15:07:29 crc kubenswrapper[4931]: I1201 15:07:29.007435 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wxd5j"
Dec 01 15:07:30 crc kubenswrapper[4931]: I1201 15:07:30.049638 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wxd5j" podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerName="registry-server" probeResult="failure" output=<
Dec 01 15:07:30 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s
Dec 01 15:07:30 crc kubenswrapper[4931]: >
Dec 01 15:07:38 crc kubenswrapper[4931]: I1201 15:07:38.929829 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rpwj9"
Dec 01 15:07:39 crc kubenswrapper[4931]: I1201 15:07:39.066423 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wxd5j"
Dec 01 15:07:39 crc kubenswrapper[4931]: I1201 15:07:39.114928 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wxd5j"
Dec 01 15:07:40 crc kubenswrapper[4931]: I1201 15:07:40.255601 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-bxzms"
Dec 01 15:07:40 crc kubenswrapper[4931]: I1201 15:07:40.348577 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gm6vp"]
Dec 01 15:07:49 crc kubenswrapper[4931]: I1201 15:07:49.872992 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 15:07:49 crc kubenswrapper[4931]: I1201 15:07:49.874337 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 15:07:49 crc kubenswrapper[4931]: I1201 15:07:49.874503 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx"
Dec 01 15:07:49 crc kubenswrapper[4931]: I1201 15:07:49.876372 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7b3bdd82a8534c77d3ea7e5ad5dd0c29ea4ba55c0d43ad658e7866a1c0c4265"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 15:07:49 crc kubenswrapper[4931]: I1201 15:07:49.876502 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://f7b3bdd82a8534c77d3ea7e5ad5dd0c29ea4ba55c0d43ad658e7866a1c0c4265" gracePeriod=600
Dec 01 15:07:50 crc kubenswrapper[4931]: I1201 15:07:50.074554 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="f7b3bdd82a8534c77d3ea7e5ad5dd0c29ea4ba55c0d43ad658e7866a1c0c4265" exitCode=0
Dec 01 15:07:50 crc kubenswrapper[4931]: I1201 15:07:50.074643 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"f7b3bdd82a8534c77d3ea7e5ad5dd0c29ea4ba55c0d43ad658e7866a1c0c4265"}
Dec 01 15:07:50 crc kubenswrapper[4931]: I1201 15:07:50.074802 4931 scope.go:117] "RemoveContainer" containerID="080585a91cc4a7d8f5432b92e7babc7a67089bb082ce579eda6ce9e8f3cd01a6"
Dec 01 15:07:51 crc kubenswrapper[4931]: I1201 15:07:51.085106 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"edf71cb1bce5d3aeba5997d04dcf4aa6526fc61ac3e3b1daeb2f121fedfeeabd"}
Dec 01 15:08:05 crc kubenswrapper[4931]: I1201 15:08:05.397683 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" podUID="e2229b8c-268a-46fd-bb3d-442032e330ff" containerName="registry" containerID="cri-o://a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad" gracePeriod=30
Dec 01 15:08:05 crc kubenswrapper[4931]: I1201 15:08:05.933485 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.056496 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-certificates\") pod \"e2229b8c-268a-46fd-bb3d-442032e330ff\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") "
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.056591 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-bound-sa-token\") pod \"e2229b8c-268a-46fd-bb3d-442032e330ff\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") "
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.056711 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2229b8c-268a-46fd-bb3d-442032e330ff-installation-pull-secrets\") pod \"e2229b8c-268a-46fd-bb3d-442032e330ff\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") "
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.058220 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-trusted-ca\") pod \"e2229b8c-268a-46fd-bb3d-442032e330ff\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") "
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.058458 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-tls\") pod \"e2229b8c-268a-46fd-bb3d-442032e330ff\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") "
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.058523 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sng4g\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-kube-api-access-sng4g\") pod \"e2229b8c-268a-46fd-bb3d-442032e330ff\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") "
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.058587 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2229b8c-268a-46fd-bb3d-442032e330ff-ca-trust-extracted\") pod \"e2229b8c-268a-46fd-bb3d-442032e330ff\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") "
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.058882 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e2229b8c-268a-46fd-bb3d-442032e330ff\" (UID: \"e2229b8c-268a-46fd-bb3d-442032e330ff\") "
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.059484 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e2229b8c-268a-46fd-bb3d-442032e330ff" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.061977 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e2229b8c-268a-46fd-bb3d-442032e330ff" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.065218 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-kube-api-access-sng4g" (OuterVolumeSpecName: "kube-api-access-sng4g") pod "e2229b8c-268a-46fd-bb3d-442032e330ff" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff"). InnerVolumeSpecName "kube-api-access-sng4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.066360 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2229b8c-268a-46fd-bb3d-442032e330ff-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e2229b8c-268a-46fd-bb3d-442032e330ff" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.066730 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e2229b8c-268a-46fd-bb3d-442032e330ff" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.068298 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e2229b8c-268a-46fd-bb3d-442032e330ff" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.072304 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e2229b8c-268a-46fd-bb3d-442032e330ff" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.080069 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2229b8c-268a-46fd-bb3d-442032e330ff-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e2229b8c-268a-46fd-bb3d-442032e330ff" (UID: "e2229b8c-268a-46fd-bb3d-442032e330ff"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.161714 4931 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.161795 4931 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.161817 4931 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e2229b8c-268a-46fd-bb3d-442032e330ff-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.161841 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2229b8c-268a-46fd-bb3d-442032e330ff-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.161863 4931 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.161883 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sng4g\" (UniqueName: \"kubernetes.io/projected/e2229b8c-268a-46fd-bb3d-442032e330ff-kube-api-access-sng4g\") on node \"crc\" DevicePath \"\""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.161903 4931 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e2229b8c-268a-46fd-bb3d-442032e330ff-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.197960 4931 generic.go:334] "Generic (PLEG): container finished" podID="e2229b8c-268a-46fd-bb3d-442032e330ff" containerID="a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad" exitCode=0
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.198018 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" event={"ID":"e2229b8c-268a-46fd-bb3d-442032e330ff","Type":"ContainerDied","Data":"a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad"}
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.198074 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp" event={"ID":"e2229b8c-268a-46fd-bb3d-442032e330ff","Type":"ContainerDied","Data":"163a48f10d5caf73a90fb599cce8a55e08bb0b8fdaa40c17a8b219dc7230da34"}
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.198115 4931 scope.go:117] "RemoveContainer" containerID="a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad"
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.198365 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gm6vp"
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.232774 4931 scope.go:117] "RemoveContainer" containerID="a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad"
Dec 01 15:08:06 crc kubenswrapper[4931]: E1201 15:08:06.233450 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad\": container with ID starting with a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad not found: ID does not exist" containerID="a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad"
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.233499 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad"} err="failed to get container status \"a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad\": rpc error: code = NotFound desc = could not find container \"a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad\": container with ID starting with a9d07650f6bb51c3c56c74320d3ed85565e89c5fccc03b71f2157ce2bc98b2ad not found: ID does not exist"
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.256316 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gm6vp"]
Dec 01 15:08:06 crc kubenswrapper[4931]: I1201 15:08:06.258890 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gm6vp"]
Dec 01 15:08:08 crc kubenswrapper[4931]: I1201 15:08:08.255718 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2229b8c-268a-46fd-bb3d-442032e330ff" path="/var/lib/kubelet/pods/e2229b8c-268a-46fd-bb3d-442032e330ff/volumes"
Dec 01 15:10:19 crc kubenswrapper[4931]: I1201 15:10:19.872474 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 15:10:19 crc kubenswrapper[4931]: I1201 15:10:19.873142 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 15:10:49 crc kubenswrapper[4931]: I1201 15:10:49.872540 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 15:10:49 crc kubenswrapper[4931]: I1201 15:10:49.873594 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 15:11:19 crc kubenswrapper[4931]: I1201 15:11:19.871930 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 15:11:19 crc kubenswrapper[4931]: I1201 15:11:19.873039 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 15:11:19 crc kubenswrapper[4931]: I1201 15:11:19.873117 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx"
Dec 01 15:11:19 crc kubenswrapper[4931]: I1201 15:11:19.873889 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"edf71cb1bce5d3aeba5997d04dcf4aa6526fc61ac3e3b1daeb2f121fedfeeabd"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 15:11:19 crc kubenswrapper[4931]: I1201 15:11:19.874139 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://edf71cb1bce5d3aeba5997d04dcf4aa6526fc61ac3e3b1daeb2f121fedfeeabd" gracePeriod=600
Dec 01 15:11:20 crc kubenswrapper[4931]: I1201 15:11:20.641575 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="edf71cb1bce5d3aeba5997d04dcf4aa6526fc61ac3e3b1daeb2f121fedfeeabd" exitCode=0
Dec 01 15:11:20 crc kubenswrapper[4931]: I1201 15:11:20.641671 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"edf71cb1bce5d3aeba5997d04dcf4aa6526fc61ac3e3b1daeb2f121fedfeeabd"}
Dec 01 15:11:20 crc kubenswrapper[4931]: I1201 15:11:20.642517 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"2c593bc454b5d325cbd0967c1c5d7f0f229621585e06f8319b965d66c0d93b5d"}
Dec 01 15:11:20 crc kubenswrapper[4931]: I1201 15:11:20.642550 4931 scope.go:117] "RemoveContainer" containerID="f7b3bdd82a8534c77d3ea7e5ad5dd0c29ea4ba55c0d43ad658e7866a1c0c4265"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.389586 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d8s9l"]
Dec 01 15:12:37 crc kubenswrapper[4931]: E1201 15:12:37.390405 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2229b8c-268a-46fd-bb3d-442032e330ff" containerName="registry"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.390420 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2229b8c-268a-46fd-bb3d-442032e330ff" containerName="registry"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.390508 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2229b8c-268a-46fd-bb3d-442032e330ff" containerName="registry"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.390938 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-d8s9l"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.392967 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.393292 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.395043 4931 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zbvj7"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.416922 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lgwml"]
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.417771 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lgwml"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.421875 4931 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bf2xs"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.432632 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-6jglv"]
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.433474 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.435333 4931 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-t5g9z"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.440570 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lgwml"]
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.447783 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-6jglv"]
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.475102 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d8s9l"]
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.571665 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddc8m\" (UniqueName: \"kubernetes.io/projected/75a5d47d-0c63-48dc-b439-8b69b82c29ed-kube-api-access-ddc8m\") pod \"cert-manager-cainjector-7f985d654d-d8s9l\" (UID: \"75a5d47d-0c63-48dc-b439-8b69b82c29ed\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d8s9l"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.571798 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksrgm\" (UniqueName: \"kubernetes.io/projected/6057bc63-1367-4776-b9ab-2750c34f017d-kube-api-access-ksrgm\") pod \"cert-manager-5b446d88c5-lgwml\" (UID: \"6057bc63-1367-4776-b9ab-2750c34f017d\") " pod="cert-manager/cert-manager-5b446d88c5-lgwml"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.571832 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hld5h\" (UniqueName: \"kubernetes.io/projected/c0dce0ee-05c3-42a7-a599-5bff0b0416ee-kube-api-access-hld5h\") pod \"cert-manager-webhook-5655c58dd6-6jglv\" (UID: \"c0dce0ee-05c3-42a7-a599-5bff0b0416ee\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.674142 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksrgm\" (UniqueName: \"kubernetes.io/projected/6057bc63-1367-4776-b9ab-2750c34f017d-kube-api-access-ksrgm\") pod \"cert-manager-5b446d88c5-lgwml\" (UID: \"6057bc63-1367-4776-b9ab-2750c34f017d\") " pod="cert-manager/cert-manager-5b446d88c5-lgwml"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.674214 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hld5h\" (UniqueName: \"kubernetes.io/projected/c0dce0ee-05c3-42a7-a599-5bff0b0416ee-kube-api-access-hld5h\") pod \"cert-manager-webhook-5655c58dd6-6jglv\" (UID: \"c0dce0ee-05c3-42a7-a599-5bff0b0416ee\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.674298 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddc8m\" (UniqueName: \"kubernetes.io/projected/75a5d47d-0c63-48dc-b439-8b69b82c29ed-kube-api-access-ddc8m\") pod \"cert-manager-cainjector-7f985d654d-d8s9l\" (UID: \"75a5d47d-0c63-48dc-b439-8b69b82c29ed\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d8s9l"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.696791 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddc8m\" (UniqueName: \"kubernetes.io/projected/75a5d47d-0c63-48dc-b439-8b69b82c29ed-kube-api-access-ddc8m\") pod \"cert-manager-cainjector-7f985d654d-d8s9l\" (UID: \"75a5d47d-0c63-48dc-b439-8b69b82c29ed\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-d8s9l"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.697369 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hld5h\" (UniqueName: \"kubernetes.io/projected/c0dce0ee-05c3-42a7-a599-5bff0b0416ee-kube-api-access-hld5h\") pod \"cert-manager-webhook-5655c58dd6-6jglv\" (UID: \"c0dce0ee-05c3-42a7-a599-5bff0b0416ee\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.697426 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksrgm\" (UniqueName: \"kubernetes.io/projected/6057bc63-1367-4776-b9ab-2750c34f017d-kube-api-access-ksrgm\") pod \"cert-manager-5b446d88c5-lgwml\" (UID: \"6057bc63-1367-4776-b9ab-2750c34f017d\") " pod="cert-manager/cert-manager-5b446d88c5-lgwml"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.715834 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-d8s9l"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.732932 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-lgwml"
Dec 01 15:12:37 crc kubenswrapper[4931]: I1201 15:12:37.751798 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv" Dec 01 15:12:38 crc kubenswrapper[4931]: I1201 15:12:38.050236 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-6jglv"] Dec 01 15:12:38 crc kubenswrapper[4931]: I1201 15:12:38.061314 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 15:12:38 crc kubenswrapper[4931]: I1201 15:12:38.156902 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-d8s9l"] Dec 01 15:12:38 crc kubenswrapper[4931]: I1201 15:12:38.209947 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv" event={"ID":"c0dce0ee-05c3-42a7-a599-5bff0b0416ee","Type":"ContainerStarted","Data":"6717a1363efc8bf19e1ed825ba2a27c20f0eb39aafb01d2390da0285f5b071ba"} Dec 01 15:12:38 crc kubenswrapper[4931]: I1201 15:12:38.212875 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-d8s9l" event={"ID":"75a5d47d-0c63-48dc-b439-8b69b82c29ed","Type":"ContainerStarted","Data":"29b0731fdb21b8700e66e77f55389117f23d366e597bccbf141b3627ddcb588c"} Dec 01 15:12:38 crc kubenswrapper[4931]: I1201 15:12:38.216620 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-lgwml"] Dec 01 15:12:38 crc kubenswrapper[4931]: W1201 15:12:38.220735 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6057bc63_1367_4776_b9ab_2750c34f017d.slice/crio-8bc4fde601f36a7496cabe9417ae82c8cc0b03532f7313c7655d402b7fc570a9 WatchSource:0}: Error finding container 8bc4fde601f36a7496cabe9417ae82c8cc0b03532f7313c7655d402b7fc570a9: Status 404 returned error can't find the container with id 8bc4fde601f36a7496cabe9417ae82c8cc0b03532f7313c7655d402b7fc570a9 Dec 01 15:12:39 crc 
kubenswrapper[4931]: I1201 15:12:39.221259 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lgwml" event={"ID":"6057bc63-1367-4776-b9ab-2750c34f017d","Type":"ContainerStarted","Data":"8bc4fde601f36a7496cabe9417ae82c8cc0b03532f7313c7655d402b7fc570a9"} Dec 01 15:12:40 crc kubenswrapper[4931]: I1201 15:12:40.232526 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv" event={"ID":"c0dce0ee-05c3-42a7-a599-5bff0b0416ee","Type":"ContainerStarted","Data":"3965cc617918f2d2b64848180e3ff01a41b8d7e4087cc833304714f77c006cc5"} Dec 01 15:12:40 crc kubenswrapper[4931]: I1201 15:12:40.233025 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv" Dec 01 15:12:40 crc kubenswrapper[4931]: I1201 15:12:40.254328 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv" podStartSLOduration=1.2960553080000001 podStartE2EDuration="3.254297142s" podCreationTimestamp="2025-12-01 15:12:37 +0000 UTC" firstStartedPulling="2025-12-01 15:12:38.061094375 +0000 UTC m=+704.486968042" lastFinishedPulling="2025-12-01 15:12:40.019336209 +0000 UTC m=+706.445209876" observedRunningTime="2025-12-01 15:12:40.24819312 +0000 UTC m=+706.674066787" watchObservedRunningTime="2025-12-01 15:12:40.254297142 +0000 UTC m=+706.680170819" Dec 01 15:12:42 crc kubenswrapper[4931]: I1201 15:12:42.251143 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-d8s9l" event={"ID":"75a5d47d-0c63-48dc-b439-8b69b82c29ed","Type":"ContainerStarted","Data":"14b8818a2d3eb50ab1ae9935168114e41a1d96fdf14d01da1df827119ac5d099"} Dec 01 15:12:42 crc kubenswrapper[4931]: I1201 15:12:42.252292 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-lgwml" 
event={"ID":"6057bc63-1367-4776-b9ab-2750c34f017d","Type":"ContainerStarted","Data":"46b8dcbe07be700ea7beb87f3ccebc8eccdde2851c22504a1a41f7dc878df4d8"} Dec 01 15:12:42 crc kubenswrapper[4931]: I1201 15:12:42.287354 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-lgwml" podStartSLOduration=2.210404479 podStartE2EDuration="5.287320854s" podCreationTimestamp="2025-12-01 15:12:37 +0000 UTC" firstStartedPulling="2025-12-01 15:12:38.223245998 +0000 UTC m=+704.649119695" lastFinishedPulling="2025-12-01 15:12:41.300162403 +0000 UTC m=+707.726036070" observedRunningTime="2025-12-01 15:12:42.285102068 +0000 UTC m=+708.710975775" watchObservedRunningTime="2025-12-01 15:12:42.287320854 +0000 UTC m=+708.713194561" Dec 01 15:12:42 crc kubenswrapper[4931]: I1201 15:12:42.288566 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-d8s9l" podStartSLOduration=2.157559915 podStartE2EDuration="5.288550361s" podCreationTimestamp="2025-12-01 15:12:37 +0000 UTC" firstStartedPulling="2025-12-01 15:12:38.164413215 +0000 UTC m=+704.590286882" lastFinishedPulling="2025-12-01 15:12:41.295403641 +0000 UTC m=+707.721277328" observedRunningTime="2025-12-01 15:12:42.268168073 +0000 UTC m=+708.694041740" watchObservedRunningTime="2025-12-01 15:12:42.288550361 +0000 UTC m=+708.714424068" Dec 01 15:12:47 crc kubenswrapper[4931]: I1201 15:12:47.755450 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-6jglv" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.322348 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v5g28"] Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.323026 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" 
podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovn-controller" containerID="cri-o://8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be" gracePeriod=30 Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.323151 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="kube-rbac-proxy-node" containerID="cri-o://b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176" gracePeriod=30 Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.323212 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovn-acl-logging" containerID="cri-o://508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a" gracePeriod=30 Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.323429 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="sbdb" containerID="cri-o://a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73" gracePeriod=30 Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.323152 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="northd" containerID="cri-o://c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95" gracePeriod=30 Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.323709 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="nbdb" containerID="cri-o://2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c" gracePeriod=30 Dec 01 
15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.323675 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16" gracePeriod=30 Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.375151 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" containerID="cri-o://9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872" gracePeriod=30 Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.697333 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/3.log" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.700728 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovn-acl-logging/0.log" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.701676 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovn-controller/0.log" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.702243 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.786825 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nrtsj"] Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787153 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787175 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787194 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787205 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787225 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="sbdb" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787235 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="sbdb" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787248 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovn-acl-logging" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787258 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovn-acl-logging" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787269 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" 
containerName="kube-rbac-proxy-ovn-metrics" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787278 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787300 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovn-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787310 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovn-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787327 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="northd" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787337 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="northd" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787357 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="kube-rbac-proxy-node" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787367 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="kube-rbac-proxy-node" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787455 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787469 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787482 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="nbdb" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787491 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="nbdb" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787504 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="kubecfg-setup" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787514 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="kubecfg-setup" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787683 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="northd" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787702 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovn-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787717 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787732 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787745 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="kube-rbac-proxy-node" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787759 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787769 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787781 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovn-acl-logging" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787791 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="nbdb" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787803 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="sbdb" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787813 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.787974 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.787987 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: E1201 15:12:48.788005 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.788015 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.788174 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerName="ovnkube-controller" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.791197 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.858979 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-node-log\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859058 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-ovn-kubernetes\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859101 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-netns\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859122 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-slash\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859149 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-var-lib-openvswitch\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859181 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovn-node-metrics-cert\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859207 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-env-overrides\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859192 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-node-log" (OuterVolumeSpecName: "node-log") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859240 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859271 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-kubelet\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859276 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-slash" (OuterVolumeSpecName: 
"host-slash") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859293 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-log-socket\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859308 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859321 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-systemd-units\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859338 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859343 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-netd\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859366 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859380 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-ovn\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859421 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859433 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-etc-openvswitch\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859453 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-systemd\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859479 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-openvswitch\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859510 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-script-lib\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859537 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-bin\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859567 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b56b\" 
(UniqueName: \"kubernetes.io/projected/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-kube-api-access-8b56b\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859592 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-config\") pod \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\" (UID: \"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a\") " Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859875 4931 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859887 4931 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859895 4931 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859906 4931 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859915 4931 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.859924 4931 
reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.860380 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.860440 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.860462 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-log-socket" (OuterVolumeSpecName: "log-socket") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.860479 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.860498 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.860533 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.860595 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.860698 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.860991 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.861032 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.861106 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.867633 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.868092 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-kube-api-access-8b56b" (OuterVolumeSpecName: "kube-api-access-8b56b") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "kube-api-access-8b56b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.881771 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" (UID: "16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.961437 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpkrx\" (UniqueName: \"kubernetes.io/projected/012a89b1-3f84-4485-a6ad-fc33168d8726-kube-api-access-zpkrx\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.961522 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-log-socket\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.961619 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/012a89b1-3f84-4485-a6ad-fc33168d8726-ovn-node-metrics-cert\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.961742 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/012a89b1-3f84-4485-a6ad-fc33168d8726-ovnkube-config\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.961779 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-run-openvswitch\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.961917 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-node-log\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.961942 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/012a89b1-3f84-4485-a6ad-fc33168d8726-env-overrides\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.961972 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962099 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-run-ovn-kubernetes\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962148 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-run-ovn\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962175 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/012a89b1-3f84-4485-a6ad-fc33168d8726-ovnkube-script-lib\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962201 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-kubelet\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962228 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-run-netns\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962258 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-systemd-units\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962290 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-var-lib-openvswitch\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962311 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-etc-openvswitch\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962378 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-slash\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 
15:12:48.962427 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-cni-bin\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962461 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-cni-netd\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962485 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-run-systemd\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962543 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962559 4931 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962572 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b56b\" (UniqueName: \"kubernetes.io/projected/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-kube-api-access-8b56b\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc 
kubenswrapper[4931]: I1201 15:12:48.962586 4931 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962600 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962611 4931 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962624 4931 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962636 4931 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962648 4931 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962661 4931 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962781 4931 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962834 4931 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962850 4931 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:48 crc kubenswrapper[4931]: I1201 15:12:48.962864 4931 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064331 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-log-socket\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064419 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/012a89b1-3f84-4485-a6ad-fc33168d8726-ovn-node-metrics-cert\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064450 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/012a89b1-3f84-4485-a6ad-fc33168d8726-ovnkube-config\") pod \"ovnkube-node-nrtsj\" (UID: 
\"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064481 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-run-openvswitch\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064523 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-node-log\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064516 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-log-socket\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064545 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/012a89b1-3f84-4485-a6ad-fc33168d8726-env-overrides\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064654 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064661 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-run-openvswitch\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064736 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064748 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-run-ovn-kubernetes\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064807 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-node-log\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064871 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-run-ovn\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 
crc kubenswrapper[4931]: I1201 15:12:49.064883 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-run-ovn-kubernetes\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064849 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-run-ovn\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.064985 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/012a89b1-3f84-4485-a6ad-fc33168d8726-ovnkube-script-lib\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065025 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-kubelet\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065048 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-run-netns\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065076 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-systemd-units\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065115 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-var-lib-openvswitch\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065134 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-etc-openvswitch\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065154 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-kubelet\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065167 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-slash\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065202 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-systemd-units\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065188 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-slash\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065210 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-etc-openvswitch\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065184 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-run-netns\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065259 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-cni-bin\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065295 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-cni-bin\") pod \"ovnkube-node-nrtsj\" (UID: 
\"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065315 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-var-lib-openvswitch\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065415 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-cni-netd\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065437 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/012a89b1-3f84-4485-a6ad-fc33168d8726-env-overrides\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065467 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-run-systemd\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065478 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-host-cni-netd\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 
15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065537 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpkrx\" (UniqueName: \"kubernetes.io/projected/012a89b1-3f84-4485-a6ad-fc33168d8726-kube-api-access-zpkrx\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065557 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/012a89b1-3f84-4485-a6ad-fc33168d8726-run-systemd\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065920 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/012a89b1-3f84-4485-a6ad-fc33168d8726-ovnkube-config\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.065981 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/012a89b1-3f84-4485-a6ad-fc33168d8726-ovnkube-script-lib\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.070048 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/012a89b1-3f84-4485-a6ad-fc33168d8726-ovn-node-metrics-cert\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.088368 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpkrx\" (UniqueName: \"kubernetes.io/projected/012a89b1-3f84-4485-a6ad-fc33168d8726-kube-api-access-zpkrx\") pod \"ovnkube-node-nrtsj\" (UID: \"012a89b1-3f84-4485-a6ad-fc33168d8726\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.118054 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.311195 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovnkube-controller/3.log" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.314855 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovn-acl-logging/0.log" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.315446 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v5g28_16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/ovn-controller/0.log" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316189 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872" exitCode=0 Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316218 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73" exitCode=0 Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316228 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c" exitCode=0 Dec 01 15:12:49 crc 
kubenswrapper[4931]: I1201 15:12:49.316228 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316280 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316295 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316305 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316329 4931 scope.go:117] "RemoveContainer" containerID="9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316348 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316243 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95" exitCode=0 Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316458 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16" exitCode=0 Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316472 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176" exitCode=0 Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316480 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a" exitCode=143 Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316488 4931 generic.go:334] "Generic (PLEG): container finished" podID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" containerID="8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be" exitCode=143 Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316521 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316824 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} Dec 01 
15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316845 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316880 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316889 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316894 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316900 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316905 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316910 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316916 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} Dec 01 
15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316921 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316931 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316963 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316971 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316976 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316981 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316986 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316991 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.316999 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317003 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317008 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317056 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317065 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317075 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317083 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317088 4931 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317094 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317099 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317104 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317110 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317125 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317132 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317138 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317145 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v5g28" event={"ID":"16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a","Type":"ContainerDied","Data":"2801ef2d3595c72cae9f7b23b6de9dcce0d98feecb7e8081c177bc10479a51c8"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317153 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317160 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317166 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317183 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317189 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317195 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317201 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} Dec 01 
15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317206 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317212 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.317218 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.319883 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/2.log" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.320566 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/1.log" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.320605 4931 generic.go:334] "Generic (PLEG): container finished" podID="db092a9c-f0f2-401d-82dd-b3af535585cc" containerID="58e0cafadf10e2f6c28ad954b6ef10668446085bb039d922999d395643c4d133" exitCode=2 Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.320661 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nwqj" event={"ID":"db092a9c-f0f2-401d-82dd-b3af535585cc","Type":"ContainerDied","Data":"58e0cafadf10e2f6c28ad954b6ef10668446085bb039d922999d395643c4d133"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.320680 4931 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991"} Dec 01 15:12:49 crc 
kubenswrapper[4931]: I1201 15:12:49.321234 4931 scope.go:117] "RemoveContainer" containerID="58e0cafadf10e2f6c28ad954b6ef10668446085bb039d922999d395643c4d133" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.321465 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6nwqj_openshift-multus(db092a9c-f0f2-401d-82dd-b3af535585cc)\"" pod="openshift-multus/multus-6nwqj" podUID="db092a9c-f0f2-401d-82dd-b3af535585cc" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.322436 4931 generic.go:334] "Generic (PLEG): container finished" podID="012a89b1-3f84-4485-a6ad-fc33168d8726" containerID="5e987626d12800051138f8b950aa29e3c861246e8a7f24739a634f92b5109e72" exitCode=0 Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.322502 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerDied","Data":"5e987626d12800051138f8b950aa29e3c861246e8a7f24739a634f92b5109e72"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.322557 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerStarted","Data":"7048fda13357d914f3afec7c2ef8d2ae49399b93dbf14a9362c5a6625adc0973"} Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.368178 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.399405 4931 scope.go:117] "RemoveContainer" containerID="a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.408059 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v5g28"] Dec 01 
15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.410850 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v5g28"] Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.430571 4931 scope.go:117] "RemoveContainer" containerID="2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.459753 4931 scope.go:117] "RemoveContainer" containerID="c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.508164 4931 scope.go:117] "RemoveContainer" containerID="1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.525007 4931 scope.go:117] "RemoveContainer" containerID="b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.548660 4931 scope.go:117] "RemoveContainer" containerID="508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.580715 4931 scope.go:117] "RemoveContainer" containerID="8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.614036 4931 scope.go:117] "RemoveContainer" containerID="9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.641570 4931 scope.go:117] "RemoveContainer" containerID="9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.642152 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": container with ID starting with 9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872 not found: ID does not exist" 
containerID="9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.642188 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} err="failed to get container status \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": rpc error: code = NotFound desc = could not find container \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": container with ID starting with 9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.642210 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.642939 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\": container with ID starting with 4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351 not found: ID does not exist" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.642990 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} err="failed to get container status \"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\": rpc error: code = NotFound desc = could not find container \"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\": container with ID starting with 4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.643023 4931 scope.go:117] 
"RemoveContainer" containerID="a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.643973 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\": container with ID starting with a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73 not found: ID does not exist" containerID="a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.644008 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} err="failed to get container status \"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\": rpc error: code = NotFound desc = could not find container \"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\": container with ID starting with a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.644026 4931 scope.go:117] "RemoveContainer" containerID="2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.644950 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\": container with ID starting with 2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c not found: ID does not exist" containerID="2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.644975 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} err="failed to get container status \"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\": rpc error: code = NotFound desc = could not find container \"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\": container with ID starting with 2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.644992 4931 scope.go:117] "RemoveContainer" containerID="c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.645357 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\": container with ID starting with c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95 not found: ID does not exist" containerID="c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.645443 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} err="failed to get container status \"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\": rpc error: code = NotFound desc = could not find container \"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\": container with ID starting with c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.645461 4931 scope.go:117] "RemoveContainer" containerID="1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.645772 4931 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\": container with ID starting with 1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16 not found: ID does not exist" containerID="1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.645799 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} err="failed to get container status \"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\": rpc error: code = NotFound desc = could not find container \"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\": container with ID starting with 1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.645816 4931 scope.go:117] "RemoveContainer" containerID="b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.646488 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\": container with ID starting with b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176 not found: ID does not exist" containerID="b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.646536 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} err="failed to get container status \"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\": rpc error: code = NotFound desc = could not find container 
\"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\": container with ID starting with b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.646782 4931 scope.go:117] "RemoveContainer" containerID="508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.647163 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\": container with ID starting with 508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a not found: ID does not exist" containerID="508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.647184 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} err="failed to get container status \"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\": rpc error: code = NotFound desc = could not find container \"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\": container with ID starting with 508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.647198 4931 scope.go:117] "RemoveContainer" containerID="8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.647527 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\": container with ID starting with 8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be not found: ID does not exist" 
containerID="8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.647569 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} err="failed to get container status \"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\": rpc error: code = NotFound desc = could not find container \"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\": container with ID starting with 8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.647585 4931 scope.go:117] "RemoveContainer" containerID="9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191" Dec 01 15:12:49 crc kubenswrapper[4931]: E1201 15:12:49.648136 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\": container with ID starting with 9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191 not found: ID does not exist" containerID="9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.648168 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191"} err="failed to get container status \"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\": rpc error: code = NotFound desc = could not find container \"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\": container with ID starting with 9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.648182 4931 scope.go:117] 
"RemoveContainer" containerID="9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.649080 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} err="failed to get container status \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": rpc error: code = NotFound desc = could not find container \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": container with ID starting with 9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.649106 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.649446 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} err="failed to get container status \"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\": rpc error: code = NotFound desc = could not find container \"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\": container with ID starting with 4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.649475 4931 scope.go:117] "RemoveContainer" containerID="a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.649739 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} err="failed to get container status \"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\": rpc error: code = 
NotFound desc = could not find container \"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\": container with ID starting with a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.649756 4931 scope.go:117] "RemoveContainer" containerID="2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.650006 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} err="failed to get container status \"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\": rpc error: code = NotFound desc = could not find container \"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\": container with ID starting with 2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.650032 4931 scope.go:117] "RemoveContainer" containerID="c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.650343 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} err="failed to get container status \"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\": rpc error: code = NotFound desc = could not find container \"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\": container with ID starting with c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.650371 4931 scope.go:117] "RemoveContainer" containerID="1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16" Dec 01 15:12:49 crc 
kubenswrapper[4931]: I1201 15:12:49.650886 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} err="failed to get container status \"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\": rpc error: code = NotFound desc = could not find container \"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\": container with ID starting with 1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.650908 4931 scope.go:117] "RemoveContainer" containerID="b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.651203 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} err="failed to get container status \"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\": rpc error: code = NotFound desc = could not find container \"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\": container with ID starting with b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.651228 4931 scope.go:117] "RemoveContainer" containerID="508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.651886 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} err="failed to get container status \"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\": rpc error: code = NotFound desc = could not find container \"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\": container 
with ID starting with 508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.651904 4931 scope.go:117] "RemoveContainer" containerID="8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.652171 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} err="failed to get container status \"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\": rpc error: code = NotFound desc = could not find container \"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\": container with ID starting with 8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.652191 4931 scope.go:117] "RemoveContainer" containerID="9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.653821 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191"} err="failed to get container status \"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\": rpc error: code = NotFound desc = could not find container \"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\": container with ID starting with 9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.653843 4931 scope.go:117] "RemoveContainer" containerID="9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.654226 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} err="failed to get container status \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": rpc error: code = NotFound desc = could not find container \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": container with ID starting with 9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.654255 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.654751 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} err="failed to get container status \"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\": rpc error: code = NotFound desc = could not find container \"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\": container with ID starting with 4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.654771 4931 scope.go:117] "RemoveContainer" containerID="a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.655082 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} err="failed to get container status \"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\": rpc error: code = NotFound desc = could not find container \"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\": container with ID starting with a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73 not found: ID does not 
exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.655125 4931 scope.go:117] "RemoveContainer" containerID="2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.655506 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} err="failed to get container status \"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\": rpc error: code = NotFound desc = could not find container \"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\": container with ID starting with 2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.655534 4931 scope.go:117] "RemoveContainer" containerID="c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.656057 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} err="failed to get container status \"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\": rpc error: code = NotFound desc = could not find container \"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\": container with ID starting with c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.656097 4931 scope.go:117] "RemoveContainer" containerID="1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.656460 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} err="failed to get container status 
\"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\": rpc error: code = NotFound desc = could not find container \"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\": container with ID starting with 1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.656497 4931 scope.go:117] "RemoveContainer" containerID="b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.657582 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} err="failed to get container status \"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\": rpc error: code = NotFound desc = could not find container \"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\": container with ID starting with b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.657624 4931 scope.go:117] "RemoveContainer" containerID="508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.658506 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} err="failed to get container status \"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\": rpc error: code = NotFound desc = could not find container \"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\": container with ID starting with 508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.658991 4931 scope.go:117] "RemoveContainer" 
containerID="8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.659925 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} err="failed to get container status \"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\": rpc error: code = NotFound desc = could not find container \"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\": container with ID starting with 8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.659952 4931 scope.go:117] "RemoveContainer" containerID="9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.660322 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191"} err="failed to get container status \"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\": rpc error: code = NotFound desc = could not find container \"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\": container with ID starting with 9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.660355 4931 scope.go:117] "RemoveContainer" containerID="9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.660710 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} err="failed to get container status \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": rpc error: code = NotFound desc = could 
not find container \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": container with ID starting with 9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.660734 4931 scope.go:117] "RemoveContainer" containerID="4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.661046 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351"} err="failed to get container status \"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\": rpc error: code = NotFound desc = could not find container \"4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351\": container with ID starting with 4f91e5bed85fba103af2d6c977a186ebcad198305801d6d9bd26486087097351 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.661085 4931 scope.go:117] "RemoveContainer" containerID="a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.661425 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73"} err="failed to get container status \"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\": rpc error: code = NotFound desc = could not find container \"a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73\": container with ID starting with a109a54e2de31dd51de63e0573fbc889dd4b2ca8a62280b24bbbe8197d409f73 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.661451 4931 scope.go:117] "RemoveContainer" containerID="2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 
15:12:49.661761 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c"} err="failed to get container status \"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\": rpc error: code = NotFound desc = could not find container \"2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c\": container with ID starting with 2cb7dff22d128fce23688f05ab43d2259c38e2a74e57b3f0a9f7e65939ccab9c not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.661783 4931 scope.go:117] "RemoveContainer" containerID="c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.662056 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95"} err="failed to get container status \"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\": rpc error: code = NotFound desc = could not find container \"c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95\": container with ID starting with c3d49c0e8eaee9e87c51f56b9d70997d7f9d35e86343e4ba7c5b8c642892ff95 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.662098 4931 scope.go:117] "RemoveContainer" containerID="1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.662354 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16"} err="failed to get container status \"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\": rpc error: code = NotFound desc = could not find container \"1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16\": container with ID starting with 
1c980f4017b101052b6d9c5093fa888100fe3e707053f8979897038a5abeaa16 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.662405 4931 scope.go:117] "RemoveContainer" containerID="b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.662859 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176"} err="failed to get container status \"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\": rpc error: code = NotFound desc = could not find container \"b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176\": container with ID starting with b56d9b1ab6b5834ce18b2ebfc5122feac1319121c4dd152d491d32fb1f677176 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.662899 4931 scope.go:117] "RemoveContainer" containerID="508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.663204 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a"} err="failed to get container status \"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\": rpc error: code = NotFound desc = could not find container \"508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a\": container with ID starting with 508267cf99cba794d7eae9dd7939eead239815b928815ff376252f0864c8473a not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.663241 4931 scope.go:117] "RemoveContainer" containerID="8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.663532 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be"} err="failed to get container status \"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\": rpc error: code = NotFound desc = could not find container \"8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be\": container with ID starting with 8325c403eb11cfd6bbdabca4f3ebad561968be61284da33efb1b2c2c579d34be not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.663570 4931 scope.go:117] "RemoveContainer" containerID="9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.663840 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191"} err="failed to get container status \"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\": rpc error: code = NotFound desc = could not find container \"9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191\": container with ID starting with 9ab9831ef2daf0fd00d09ea9ceff725bb2251490787a8f5ef7eda77f817ff191 not found: ID does not exist" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.663861 4931 scope.go:117] "RemoveContainer" containerID="9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872" Dec 01 15:12:49 crc kubenswrapper[4931]: I1201 15:12:49.664296 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872"} err="failed to get container status \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": rpc error: code = NotFound desc = could not find container \"9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872\": container with ID starting with 9af47f605b9af3c2778ef0b543e515b2e158dcc3a624df772c5f891904559872 not found: ID does not 
exist" Dec 01 15:12:50 crc kubenswrapper[4931]: I1201 15:12:50.249670 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a" path="/var/lib/kubelet/pods/16e4fd4a-b253-4b2f-8f42-ddbfc4dd8f5a/volumes" Dec 01 15:12:50 crc kubenswrapper[4931]: I1201 15:12:50.333750 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerStarted","Data":"ff839732cf2dee83e595f8fbd81fb452147986aefae8d54a57b09ea9b1f4be3e"} Dec 01 15:12:50 crc kubenswrapper[4931]: I1201 15:12:50.333802 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerStarted","Data":"532f3fbb483aaf180060a2364ad2eba1737363f3c3d6f552b9b4fe65731d8332"} Dec 01 15:12:50 crc kubenswrapper[4931]: I1201 15:12:50.333815 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerStarted","Data":"169988f32b348e465023ee9118496973e5652f4b730fe3f204a34fe6cc399f80"} Dec 01 15:12:50 crc kubenswrapper[4931]: I1201 15:12:50.333829 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerStarted","Data":"ed36ca8eac43be0e45e5f8543214932b2d15a2587ac32366e9d0a205fde3e750"} Dec 01 15:12:50 crc kubenswrapper[4931]: I1201 15:12:50.333838 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerStarted","Data":"82ca4c4e8e3ba5a52866c4c6a9c7716aff033e504f1c318b2aa3a979ca2d39cd"} Dec 01 15:12:50 crc kubenswrapper[4931]: I1201 15:12:50.333849 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerStarted","Data":"c2d212f1c9db5a4ac9e2ec1a519f279661a03ccdd1e01d708d9b3d159c387814"} Dec 01 15:12:53 crc kubenswrapper[4931]: I1201 15:12:53.382772 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerStarted","Data":"61425fc91cb6069e067e81c32c8c0656e645dfd47b28faa20757a045e7afeb7c"} Dec 01 15:12:54 crc kubenswrapper[4931]: I1201 15:12:54.580650 4931 scope.go:117] "RemoveContainer" containerID="056974f62446b2d5a4459d1c9bfb7a61917b2d482eabdca780647445c6865991" Dec 01 15:12:55 crc kubenswrapper[4931]: I1201 15:12:55.400291 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" event={"ID":"012a89b1-3f84-4485-a6ad-fc33168d8726","Type":"ContainerStarted","Data":"09e0d4a3560046ffbd8774d81592276dc75600bf8a2e751668fa9956e5456344"} Dec 01 15:12:55 crc kubenswrapper[4931]: I1201 15:12:55.401553 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:55 crc kubenswrapper[4931]: I1201 15:12:55.401586 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:55 crc kubenswrapper[4931]: I1201 15:12:55.401633 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:55 crc kubenswrapper[4931]: I1201 15:12:55.405563 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/2.log" Dec 01 15:12:55 crc kubenswrapper[4931]: I1201 15:12:55.436589 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" podStartSLOduration=7.436516028 
podStartE2EDuration="7.436516028s" podCreationTimestamp="2025-12-01 15:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:12:55.435511848 +0000 UTC m=+721.861385525" watchObservedRunningTime="2025-12-01 15:12:55.436516028 +0000 UTC m=+721.862389705" Dec 01 15:12:55 crc kubenswrapper[4931]: I1201 15:12:55.461961 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:12:55 crc kubenswrapper[4931]: I1201 15:12:55.464243 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:13:03 crc kubenswrapper[4931]: I1201 15:13:03.241538 4931 scope.go:117] "RemoveContainer" containerID="58e0cafadf10e2f6c28ad954b6ef10668446085bb039d922999d395643c4d133" Dec 01 15:13:03 crc kubenswrapper[4931]: E1201 15:13:03.242842 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6nwqj_openshift-multus(db092a9c-f0f2-401d-82dd-b3af535585cc)\"" pod="openshift-multus/multus-6nwqj" podUID="db092a9c-f0f2-401d-82dd-b3af535585cc" Dec 01 15:13:16 crc kubenswrapper[4931]: I1201 15:13:16.241792 4931 scope.go:117] "RemoveContainer" containerID="58e0cafadf10e2f6c28ad954b6ef10668446085bb039d922999d395643c4d133" Dec 01 15:13:16 crc kubenswrapper[4931]: I1201 15:13:16.549243 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6nwqj_db092a9c-f0f2-401d-82dd-b3af535585cc/kube-multus/2.log" Dec 01 15:13:16 crc kubenswrapper[4931]: I1201 15:13:16.549661 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6nwqj" 
event={"ID":"db092a9c-f0f2-401d-82dd-b3af535585cc","Type":"ContainerStarted","Data":"150baf0d59412028e65350ad48d14ffb2c032a8e00d18155dcb5f2accaeaa1cd"} Dec 01 15:13:19 crc kubenswrapper[4931]: I1201 15:13:19.157726 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nrtsj" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.066659 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt"] Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.068623 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.070378 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.075535 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt"] Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.189524 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.189592 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt\" (UID: 
\"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.189713 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tt6d\" (UniqueName: \"kubernetes.io/projected/f06697c2-e5a8-47a0-b960-d71e2d3c591a-kube-api-access-6tt6d\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.291057 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tt6d\" (UniqueName: \"kubernetes.io/projected/f06697c2-e5a8-47a0-b960-d71e2d3c591a-kube-api-access-6tt6d\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.291175 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.291207 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.291711 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.292131 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.310161 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tt6d\" (UniqueName: \"kubernetes.io/projected/f06697c2-e5a8-47a0-b960-d71e2d3c591a-kube-api-access-6tt6d\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.389764 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.596978 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt"] Dec 01 15:13:30 crc kubenswrapper[4931]: W1201 15:13:30.604130 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf06697c2_e5a8_47a0_b960_d71e2d3c591a.slice/crio-0c2fd4ee49b1c97959f510399d9900bb74cbd2610a8a2016ff3de51a5861d604 WatchSource:0}: Error finding container 0c2fd4ee49b1c97959f510399d9900bb74cbd2610a8a2016ff3de51a5861d604: Status 404 returned error can't find the container with id 0c2fd4ee49b1c97959f510399d9900bb74cbd2610a8a2016ff3de51a5861d604 Dec 01 15:13:30 crc kubenswrapper[4931]: I1201 15:13:30.637406 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" event={"ID":"f06697c2-e5a8-47a0-b960-d71e2d3c591a","Type":"ContainerStarted","Data":"0c2fd4ee49b1c97959f510399d9900bb74cbd2610a8a2016ff3de51a5861d604"} Dec 01 15:13:31 crc kubenswrapper[4931]: I1201 15:13:31.648611 4931 generic.go:334] "Generic (PLEG): container finished" podID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerID="efbf5952a2b9b506c2da198c4166383b36c6705e55e12976e82a9cb2e400a1ad" exitCode=0 Dec 01 15:13:31 crc kubenswrapper[4931]: I1201 15:13:31.648725 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" event={"ID":"f06697c2-e5a8-47a0-b960-d71e2d3c591a","Type":"ContainerDied","Data":"efbf5952a2b9b506c2da198c4166383b36c6705e55e12976e82a9cb2e400a1ad"} Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.308478 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-f7v66"] Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.310888 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.327311 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7v66"] Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.422799 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbss8\" (UniqueName: \"kubernetes.io/projected/6b10c645-d72b-4fe8-8ae0-9f169326fed4-kube-api-access-mbss8\") pod \"redhat-operators-f7v66\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.422862 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-catalog-content\") pod \"redhat-operators-f7v66\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.422890 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-utilities\") pod \"redhat-operators-f7v66\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.524724 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbss8\" (UniqueName: \"kubernetes.io/projected/6b10c645-d72b-4fe8-8ae0-9f169326fed4-kube-api-access-mbss8\") pod \"redhat-operators-f7v66\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " 
pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.524861 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-catalog-content\") pod \"redhat-operators-f7v66\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.524902 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-utilities\") pod \"redhat-operators-f7v66\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.525565 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-catalog-content\") pod \"redhat-operators-f7v66\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.525699 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-utilities\") pod \"redhat-operators-f7v66\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.561835 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbss8\" (UniqueName: \"kubernetes.io/projected/6b10c645-d72b-4fe8-8ae0-9f169326fed4-kube-api-access-mbss8\") pod \"redhat-operators-f7v66\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc 
kubenswrapper[4931]: I1201 15:13:32.638036 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:32 crc kubenswrapper[4931]: I1201 15:13:32.873904 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7v66"] Dec 01 15:13:32 crc kubenswrapper[4931]: W1201 15:13:32.884064 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b10c645_d72b_4fe8_8ae0_9f169326fed4.slice/crio-8b46837c3ed2f993df087f424818feaee2cc6556c98400d8c57321f57ae1ec18 WatchSource:0}: Error finding container 8b46837c3ed2f993df087f424818feaee2cc6556c98400d8c57321f57ae1ec18: Status 404 returned error can't find the container with id 8b46837c3ed2f993df087f424818feaee2cc6556c98400d8c57321f57ae1ec18 Dec 01 15:13:33 crc kubenswrapper[4931]: I1201 15:13:33.662072 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7v66" event={"ID":"6b10c645-d72b-4fe8-8ae0-9f169326fed4","Type":"ContainerStarted","Data":"142a11cf01d0408a03c71b7bee2ff0c05363de8de4b1cd051a8882b290f4de2b"} Dec 01 15:13:33 crc kubenswrapper[4931]: I1201 15:13:33.662125 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7v66" event={"ID":"6b10c645-d72b-4fe8-8ae0-9f169326fed4","Type":"ContainerStarted","Data":"8b46837c3ed2f993df087f424818feaee2cc6556c98400d8c57321f57ae1ec18"} Dec 01 15:13:34 crc kubenswrapper[4931]: I1201 15:13:34.670759 4931 generic.go:334] "Generic (PLEG): container finished" podID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerID="142a11cf01d0408a03c71b7bee2ff0c05363de8de4b1cd051a8882b290f4de2b" exitCode=0 Dec 01 15:13:34 crc kubenswrapper[4931]: I1201 15:13:34.671332 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7v66" 
event={"ID":"6b10c645-d72b-4fe8-8ae0-9f169326fed4","Type":"ContainerDied","Data":"142a11cf01d0408a03c71b7bee2ff0c05363de8de4b1cd051a8882b290f4de2b"} Dec 01 15:13:34 crc kubenswrapper[4931]: I1201 15:13:34.809588 4931 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 15:13:35 crc kubenswrapper[4931]: I1201 15:13:35.679223 4931 generic.go:334] "Generic (PLEG): container finished" podID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerID="a293198a0880f7c8f93f2661ed7554e6f56b15a217632f6da22b2a5bea083797" exitCode=0 Dec 01 15:13:35 crc kubenswrapper[4931]: I1201 15:13:35.679276 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" event={"ID":"f06697c2-e5a8-47a0-b960-d71e2d3c591a","Type":"ContainerDied","Data":"a293198a0880f7c8f93f2661ed7554e6f56b15a217632f6da22b2a5bea083797"} Dec 01 15:13:36 crc kubenswrapper[4931]: I1201 15:13:36.687731 4931 generic.go:334] "Generic (PLEG): container finished" podID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerID="ec3a47abfaa8a4c128ad6c7f6ca03be931d2adf191614c28cc07575af25e0752" exitCode=0 Dec 01 15:13:36 crc kubenswrapper[4931]: I1201 15:13:36.687816 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7v66" event={"ID":"6b10c645-d72b-4fe8-8ae0-9f169326fed4","Type":"ContainerDied","Data":"ec3a47abfaa8a4c128ad6c7f6ca03be931d2adf191614c28cc07575af25e0752"} Dec 01 15:13:36 crc kubenswrapper[4931]: I1201 15:13:36.693569 4931 generic.go:334] "Generic (PLEG): container finished" podID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerID="7a147b624f195ae38d7a4d630d90708634b53f21e478d3a8e7fe55ac0a729453" exitCode=0 Dec 01 15:13:36 crc kubenswrapper[4931]: I1201 15:13:36.693625 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" 
event={"ID":"f06697c2-e5a8-47a0-b960-d71e2d3c591a","Type":"ContainerDied","Data":"7a147b624f195ae38d7a4d630d90708634b53f21e478d3a8e7fe55ac0a729453"} Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.100824 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4xwh"] Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.101904 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.127752 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4xwh"] Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.196916 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-catalog-content\") pod \"certified-operators-p4xwh\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.197007 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgngv\" (UniqueName: \"kubernetes.io/projected/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-kube-api-access-wgngv\") pod \"certified-operators-p4xwh\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.197135 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-utilities\") pod \"certified-operators-p4xwh\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.298315 
4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgngv\" (UniqueName: \"kubernetes.io/projected/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-kube-api-access-wgngv\") pod \"certified-operators-p4xwh\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.298465 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-utilities\") pod \"certified-operators-p4xwh\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.298617 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-catalog-content\") pod \"certified-operators-p4xwh\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.299583 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-catalog-content\") pod \"certified-operators-p4xwh\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.299572 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-utilities\") pod \"certified-operators-p4xwh\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.328541 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wgngv\" (UniqueName: \"kubernetes.io/projected/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-kube-api-access-wgngv\") pod \"certified-operators-p4xwh\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.417091 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.759703 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4xwh"] Dec 01 15:13:37 crc kubenswrapper[4931]: I1201 15:13:37.976209 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.121123 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-util\") pod \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.121789 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-bundle\") pod \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.121933 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tt6d\" (UniqueName: \"kubernetes.io/projected/f06697c2-e5a8-47a0-b960-d71e2d3c591a-kube-api-access-6tt6d\") pod \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\" (UID: \"f06697c2-e5a8-47a0-b960-d71e2d3c591a\") " Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 
15:13:38.122492 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-bundle" (OuterVolumeSpecName: "bundle") pod "f06697c2-e5a8-47a0-b960-d71e2d3c591a" (UID: "f06697c2-e5a8-47a0-b960-d71e2d3c591a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.127346 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f06697c2-e5a8-47a0-b960-d71e2d3c591a-kube-api-access-6tt6d" (OuterVolumeSpecName: "kube-api-access-6tt6d") pod "f06697c2-e5a8-47a0-b960-d71e2d3c591a" (UID: "f06697c2-e5a8-47a0-b960-d71e2d3c591a"). InnerVolumeSpecName "kube-api-access-6tt6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.142255 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-util" (OuterVolumeSpecName: "util") pod "f06697c2-e5a8-47a0-b960-d71e2d3c591a" (UID: "f06697c2-e5a8-47a0-b960-d71e2d3c591a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.223308 4931 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-util\") on node \"crc\" DevicePath \"\"" Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.223349 4931 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f06697c2-e5a8-47a0-b960-d71e2d3c591a-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.223358 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tt6d\" (UniqueName: \"kubernetes.io/projected/f06697c2-e5a8-47a0-b960-d71e2d3c591a-kube-api-access-6tt6d\") on node \"crc\" DevicePath \"\"" Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.721937 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" event={"ID":"f06697c2-e5a8-47a0-b960-d71e2d3c591a","Type":"ContainerDied","Data":"0c2fd4ee49b1c97959f510399d9900bb74cbd2610a8a2016ff3de51a5861d604"} Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.722006 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2fd4ee49b1c97959f510399d9900bb74cbd2610a8a2016ff3de51a5861d604" Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.721956 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt" Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.724422 4931 generic.go:334] "Generic (PLEG): container finished" podID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerID="b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22" exitCode=0 Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.724493 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4xwh" event={"ID":"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda","Type":"ContainerDied","Data":"b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22"} Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.724515 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4xwh" event={"ID":"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda","Type":"ContainerStarted","Data":"2dca7df5c5b38ae303f0926543eb39034cd641abfe6434e53c0d5a9b3a3abd2a"} Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.728445 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7v66" event={"ID":"6b10c645-d72b-4fe8-8ae0-9f169326fed4","Type":"ContainerStarted","Data":"3663313b69f218d7c977c0c7f5f60e08e1e37d71c9655144b6c8ff328c0d8d48"} Dec 01 15:13:38 crc kubenswrapper[4931]: I1201 15:13:38.785281 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f7v66" podStartSLOduration=3.270199908 podStartE2EDuration="6.785236738s" podCreationTimestamp="2025-12-01 15:13:32 +0000 UTC" firstStartedPulling="2025-12-01 15:13:34.67662694 +0000 UTC m=+761.102500617" lastFinishedPulling="2025-12-01 15:13:38.19166378 +0000 UTC m=+764.617537447" observedRunningTime="2025-12-01 15:13:38.771760802 +0000 UTC m=+765.197634509" watchObservedRunningTime="2025-12-01 15:13:38.785236738 +0000 UTC m=+765.211110445" Dec 01 15:13:41 crc 
kubenswrapper[4931]: I1201 15:13:41.190344 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95"] Dec 01 15:13:41 crc kubenswrapper[4931]: E1201 15:13:41.190920 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerName="extract" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.190935 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerName="extract" Dec 01 15:13:41 crc kubenswrapper[4931]: E1201 15:13:41.190947 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerName="pull" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.190953 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerName="pull" Dec 01 15:13:41 crc kubenswrapper[4931]: E1201 15:13:41.190970 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerName="util" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.190976 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerName="util" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.191074 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f06697c2-e5a8-47a0-b960-d71e2d3c591a" containerName="extract" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.191483 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.194130 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-xxx7v" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.195158 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.195574 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.206670 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95"] Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.268982 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slxvd\" (UniqueName: \"kubernetes.io/projected/915c495c-c7fa-4c00-ad7e-03d0e7ba74c9-kube-api-access-slxvd\") pod \"nmstate-operator-5b5b58f5c8-h5j95\" (UID: \"915c495c-c7fa-4c00-ad7e-03d0e7ba74c9\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.370438 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slxvd\" (UniqueName: \"kubernetes.io/projected/915c495c-c7fa-4c00-ad7e-03d0e7ba74c9-kube-api-access-slxvd\") pod \"nmstate-operator-5b5b58f5c8-h5j95\" (UID: \"915c495c-c7fa-4c00-ad7e-03d0e7ba74c9\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.404341 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slxvd\" (UniqueName: \"kubernetes.io/projected/915c495c-c7fa-4c00-ad7e-03d0e7ba74c9-kube-api-access-slxvd\") pod \"nmstate-operator-5b5b58f5c8-h5j95\" (UID: 
\"915c495c-c7fa-4c00-ad7e-03d0e7ba74c9\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.506759 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95" Dec 01 15:13:41 crc kubenswrapper[4931]: I1201 15:13:41.992594 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95"] Dec 01 15:13:42 crc kubenswrapper[4931]: W1201 15:13:42.014896 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod915c495c_c7fa_4c00_ad7e_03d0e7ba74c9.slice/crio-b257c38caa5fd4362665f592eb7664dd1167217112709373a21b3f95d14349be WatchSource:0}: Error finding container b257c38caa5fd4362665f592eb7664dd1167217112709373a21b3f95d14349be: Status 404 returned error can't find the container with id b257c38caa5fd4362665f592eb7664dd1167217112709373a21b3f95d14349be Dec 01 15:13:42 crc kubenswrapper[4931]: I1201 15:13:42.639079 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:42 crc kubenswrapper[4931]: I1201 15:13:42.639142 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:42 crc kubenswrapper[4931]: I1201 15:13:42.761879 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95" event={"ID":"915c495c-c7fa-4c00-ad7e-03d0e7ba74c9","Type":"ContainerStarted","Data":"b257c38caa5fd4362665f592eb7664dd1167217112709373a21b3f95d14349be"} Dec 01 15:13:43 crc kubenswrapper[4931]: I1201 15:13:43.689097 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7v66" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerName="registry-server" probeResult="failure" output=< Dec 
01 15:13:43 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Dec 01 15:13:43 crc kubenswrapper[4931]: > Dec 01 15:13:46 crc kubenswrapper[4931]: I1201 15:13:46.787124 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4xwh" event={"ID":"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda","Type":"ContainerStarted","Data":"c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655"} Dec 01 15:13:47 crc kubenswrapper[4931]: I1201 15:13:47.798207 4931 generic.go:334] "Generic (PLEG): container finished" podID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerID="c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655" exitCode=0 Dec 01 15:13:47 crc kubenswrapper[4931]: I1201 15:13:47.798319 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4xwh" event={"ID":"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda","Type":"ContainerDied","Data":"c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655"} Dec 01 15:13:49 crc kubenswrapper[4931]: I1201 15:13:49.813794 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95" event={"ID":"915c495c-c7fa-4c00-ad7e-03d0e7ba74c9","Type":"ContainerStarted","Data":"0301a0e3f10e6899f6c65a5db58131ac37d0b39411c2ca435a99d4fd721718b9"} Dec 01 15:13:49 crc kubenswrapper[4931]: I1201 15:13:49.837910 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-h5j95" podStartSLOduration=1.755436209 podStartE2EDuration="8.837889708s" podCreationTimestamp="2025-12-01 15:13:41 +0000 UTC" firstStartedPulling="2025-12-01 15:13:42.019539621 +0000 UTC m=+768.445413278" lastFinishedPulling="2025-12-01 15:13:49.1019931 +0000 UTC m=+775.527866777" observedRunningTime="2025-12-01 15:13:49.832757107 +0000 UTC m=+776.258630774" watchObservedRunningTime="2025-12-01 15:13:49.837889708 +0000 UTC m=+776.263763375" Dec 
01 15:13:49 crc kubenswrapper[4931]: I1201 15:13:49.872192 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:13:49 crc kubenswrapper[4931]: I1201 15:13:49.872276 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.822523 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4xwh" event={"ID":"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda","Type":"ContainerStarted","Data":"690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12"} Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.874151 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7"] Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.882709 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7" Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.889053 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-jb6x6" Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.889998 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8"] Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.891223 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.893748 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.904568 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7"] Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.921457 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4xwh" podStartSLOduration=2.60335598 podStartE2EDuration="13.921427748s" podCreationTimestamp="2025-12-01 15:13:37 +0000 UTC" firstStartedPulling="2025-12-01 15:13:38.727037688 +0000 UTC m=+765.152911365" lastFinishedPulling="2025-12-01 15:13:50.045109456 +0000 UTC m=+776.470983133" observedRunningTime="2025-12-01 15:13:50.879237509 +0000 UTC m=+777.305111176" watchObservedRunningTime="2025-12-01 15:13:50.921427748 +0000 UTC m=+777.347301415" Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.938204 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnsnb\" (UniqueName: \"kubernetes.io/projected/8be2b09a-fed1-4f7f-9424-5ca00a814c3d-kube-api-access-vnsnb\") pod \"nmstate-metrics-7f946cbc9-4xwr7\" (UID: \"8be2b09a-fed1-4f7f-9424-5ca00a814c3d\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7" Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.945587 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8ljjr"] Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.946895 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:50 crc kubenswrapper[4931]: I1201 15:13:50.959443 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8"] Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.019677 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw"] Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.020644 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.022769 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.023029 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-gqx66" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.024791 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.035031 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw"] Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.038999 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fd514864-e79c-4cd7-9517-0b3f9fbee078-ovs-socket\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.039073 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnsnb\" (UniqueName: \"kubernetes.io/projected/8be2b09a-fed1-4f7f-9424-5ca00a814c3d-kube-api-access-vnsnb\") pod 
\"nmstate-metrics-7f946cbc9-4xwr7\" (UID: \"8be2b09a-fed1-4f7f-9424-5ca00a814c3d\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.039163 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfhk7\" (UniqueName: \"kubernetes.io/projected/f821ad14-f34f-49cb-a884-905cf0219454-kube-api-access-wfhk7\") pod \"nmstate-webhook-5f6d4c5ccb-5xlg8\" (UID: \"f821ad14-f34f-49cb-a884-905cf0219454\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.039256 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fd514864-e79c-4cd7-9517-0b3f9fbee078-nmstate-lock\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.039275 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f821ad14-f34f-49cb-a884-905cf0219454-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-5xlg8\" (UID: \"f821ad14-f34f-49cb-a884-905cf0219454\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.039301 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fd514864-e79c-4cd7-9517-0b3f9fbee078-dbus-socket\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.039395 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7mpq\" (UniqueName: 
\"kubernetes.io/projected/fd514864-e79c-4cd7-9517-0b3f9fbee078-kube-api-access-n7mpq\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.062929 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnsnb\" (UniqueName: \"kubernetes.io/projected/8be2b09a-fed1-4f7f-9424-5ca00a814c3d-kube-api-access-vnsnb\") pod \"nmstate-metrics-7f946cbc9-4xwr7\" (UID: \"8be2b09a-fed1-4f7f-9424-5ca00a814c3d\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.141923 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fd514864-e79c-4cd7-9517-0b3f9fbee078-dbus-socket\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142178 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fd514864-e79c-4cd7-9517-0b3f9fbee078-dbus-socket\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142252 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27nw9\" (UniqueName: \"kubernetes.io/projected/a5b7cbc9-6afb-414c-ac6e-569b6d9634ed-kube-api-access-27nw9\") pod \"nmstate-console-plugin-7fbb5f6569-lt2nw\" (UID: \"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142321 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7mpq\" 
(UniqueName: \"kubernetes.io/projected/fd514864-e79c-4cd7-9517-0b3f9fbee078-kube-api-access-n7mpq\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142413 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5b7cbc9-6afb-414c-ac6e-569b6d9634ed-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-lt2nw\" (UID: \"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142525 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fd514864-e79c-4cd7-9517-0b3f9fbee078-ovs-socket\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142571 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a5b7cbc9-6afb-414c-ac6e-569b6d9634ed-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-lt2nw\" (UID: \"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142621 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fd514864-e79c-4cd7-9517-0b3f9fbee078-ovs-socket\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142665 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wfhk7\" (UniqueName: \"kubernetes.io/projected/f821ad14-f34f-49cb-a884-905cf0219454-kube-api-access-wfhk7\") pod \"nmstate-webhook-5f6d4c5ccb-5xlg8\" (UID: \"f821ad14-f34f-49cb-a884-905cf0219454\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142708 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fd514864-e79c-4cd7-9517-0b3f9fbee078-nmstate-lock\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142724 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f821ad14-f34f-49cb-a884-905cf0219454-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-5xlg8\" (UID: \"f821ad14-f34f-49cb-a884-905cf0219454\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.142776 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fd514864-e79c-4cd7-9517-0b3f9fbee078-nmstate-lock\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.146959 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f821ad14-f34f-49cb-a884-905cf0219454-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-5xlg8\" (UID: \"f821ad14-f34f-49cb-a884-905cf0219454\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.165687 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7mpq\" (UniqueName: 
\"kubernetes.io/projected/fd514864-e79c-4cd7-9517-0b3f9fbee078-kube-api-access-n7mpq\") pod \"nmstate-handler-8ljjr\" (UID: \"fd514864-e79c-4cd7-9517-0b3f9fbee078\") " pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.172649 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfhk7\" (UniqueName: \"kubernetes.io/projected/f821ad14-f34f-49cb-a884-905cf0219454-kube-api-access-wfhk7\") pod \"nmstate-webhook-5f6d4c5ccb-5xlg8\" (UID: \"f821ad14-f34f-49cb-a884-905cf0219454\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.218003 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-858544c664-mmdhk"] Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.219074 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.224831 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.235402 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-858544c664-mmdhk"] Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.242332 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.257020 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5b7cbc9-6afb-414c-ac6e-569b6d9634ed-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-lt2nw\" (UID: \"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.257072 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a5b7cbc9-6afb-414c-ac6e-569b6d9634ed-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-lt2nw\" (UID: \"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.257143 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27nw9\" (UniqueName: \"kubernetes.io/projected/a5b7cbc9-6afb-414c-ac6e-569b6d9634ed-kube-api-access-27nw9\") pod \"nmstate-console-plugin-7fbb5f6569-lt2nw\" (UID: \"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.258704 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a5b7cbc9-6afb-414c-ac6e-569b6d9634ed-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-lt2nw\" (UID: \"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.261327 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a5b7cbc9-6afb-414c-ac6e-569b6d9634ed-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-lt2nw\" (UID: \"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.270012 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.282805 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27nw9\" (UniqueName: \"kubernetes.io/projected/a5b7cbc9-6afb-414c-ac6e-569b6d9634ed-kube-api-access-27nw9\") pod \"nmstate-console-plugin-7fbb5f6569-lt2nw\" (UID: \"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: W1201 15:13:51.292033 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd514864_e79c_4cd7_9517_0b3f9fbee078.slice/crio-24ed233339a976ad8296975ba0da2b57e62580bff55f0bc45c398802dd420c3d WatchSource:0}: Error finding container 24ed233339a976ad8296975ba0da2b57e62580bff55f0bc45c398802dd420c3d: Status 404 returned error can't find the container with id 24ed233339a976ad8296975ba0da2b57e62580bff55f0bc45c398802dd420c3d Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.339942 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.359065 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4386901-043d-4dda-a5ac-02d86b5a5906-console-serving-cert\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.359498 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw5bf\" (UniqueName: \"kubernetes.io/projected/e4386901-043d-4dda-a5ac-02d86b5a5906-kube-api-access-tw5bf\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.359530 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-trusted-ca-bundle\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.359561 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-console-config\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.359591 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-oauth-serving-cert\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.359608 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4386901-043d-4dda-a5ac-02d86b5a5906-console-oauth-config\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.359628 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-service-ca\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.460640 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-trusted-ca-bundle\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.460692 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-console-config\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.460731 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-oauth-serving-cert\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.460756 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e4386901-043d-4dda-a5ac-02d86b5a5906-console-oauth-config\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.460777 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-service-ca\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.460837 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4386901-043d-4dda-a5ac-02d86b5a5906-console-serving-cert\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.460862 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw5bf\" (UniqueName: \"kubernetes.io/projected/e4386901-043d-4dda-a5ac-02d86b5a5906-kube-api-access-tw5bf\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.461926 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-console-config\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.462139 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-service-ca\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.463053 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-trusted-ca-bundle\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.463347 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e4386901-043d-4dda-a5ac-02d86b5a5906-oauth-serving-cert\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.469978 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4386901-043d-4dda-a5ac-02d86b5a5906-console-serving-cert\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.470258 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/e4386901-043d-4dda-a5ac-02d86b5a5906-console-oauth-config\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.478055 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw5bf\" (UniqueName: \"kubernetes.io/projected/e4386901-043d-4dda-a5ac-02d86b5a5906-kube-api-access-tw5bf\") pod \"console-858544c664-mmdhk\" (UID: \"e4386901-043d-4dda-a5ac-02d86b5a5906\") " pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.567272 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.599498 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw"] Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.687490 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7"] Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.729370 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8"] Dec 01 15:13:51 crc kubenswrapper[4931]: W1201 15:13:51.734497 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf821ad14_f34f_49cb_a884_905cf0219454.slice/crio-d6e438d8b383381e91b701913bb4ba69207374e3450bbc4511dbcdd680c14bf1 WatchSource:0}: Error finding container d6e438d8b383381e91b701913bb4ba69207374e3450bbc4511dbcdd680c14bf1: Status 404 returned error can't find the container with id d6e438d8b383381e91b701913bb4ba69207374e3450bbc4511dbcdd680c14bf1 Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.828877 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7" event={"ID":"8be2b09a-fed1-4f7f-9424-5ca00a814c3d","Type":"ContainerStarted","Data":"9c59df8a20e4d0024424286032fa7b40567a31ef196defda2548cf95d7dc0664"} Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.829752 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" event={"ID":"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed","Type":"ContainerStarted","Data":"6549b742ce001e0202f5489daa84f463f70d187e1320025dee36b6dd274c8e17"} Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.831016 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8ljjr" event={"ID":"fd514864-e79c-4cd7-9517-0b3f9fbee078","Type":"ContainerStarted","Data":"24ed233339a976ad8296975ba0da2b57e62580bff55f0bc45c398802dd420c3d"} Dec 01 15:13:51 crc kubenswrapper[4931]: I1201 15:13:51.834237 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" event={"ID":"f821ad14-f34f-49cb-a884-905cf0219454","Type":"ContainerStarted","Data":"d6e438d8b383381e91b701913bb4ba69207374e3450bbc4511dbcdd680c14bf1"} Dec 01 15:13:52 crc kubenswrapper[4931]: I1201 15:13:52.059523 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-858544c664-mmdhk"] Dec 01 15:13:52 crc kubenswrapper[4931]: W1201 15:13:52.068251 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4386901_043d_4dda_a5ac_02d86b5a5906.slice/crio-7a7709aa84ee54edf1358832a8f2dde3bdbe4b29d4be36cc14d5981212806bda WatchSource:0}: Error finding container 7a7709aa84ee54edf1358832a8f2dde3bdbe4b29d4be36cc14d5981212806bda: Status 404 returned error can't find the container with id 7a7709aa84ee54edf1358832a8f2dde3bdbe4b29d4be36cc14d5981212806bda Dec 01 15:13:52 crc kubenswrapper[4931]: I1201 15:13:52.689854 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:52 crc kubenswrapper[4931]: I1201 15:13:52.749341 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:52 crc kubenswrapper[4931]: I1201 15:13:52.842632 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858544c664-mmdhk" event={"ID":"e4386901-043d-4dda-a5ac-02d86b5a5906","Type":"ContainerStarted","Data":"155c7d43a95b8b992423480d056a5f2b45266b16c1b0441c254970d107a1a733"} Dec 01 15:13:52 crc kubenswrapper[4931]: I1201 15:13:52.842724 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858544c664-mmdhk" event={"ID":"e4386901-043d-4dda-a5ac-02d86b5a5906","Type":"ContainerStarted","Data":"7a7709aa84ee54edf1358832a8f2dde3bdbe4b29d4be36cc14d5981212806bda"} Dec 01 15:13:52 crc kubenswrapper[4931]: I1201 15:13:52.869126 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-858544c664-mmdhk" podStartSLOduration=1.8690935149999999 podStartE2EDuration="1.869093515s" podCreationTimestamp="2025-12-01 15:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:13:52.868855078 +0000 UTC m=+779.294728755" watchObservedRunningTime="2025-12-01 15:13:52.869093515 +0000 UTC m=+779.294967192" Dec 01 15:13:53 crc kubenswrapper[4931]: I1201 15:13:53.099527 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7v66"] Dec 01 15:13:53 crc kubenswrapper[4931]: I1201 15:13:53.851784 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f7v66" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerName="registry-server" 
containerID="cri-o://3663313b69f218d7c977c0c7f5f60e08e1e37d71c9655144b6c8ff328c0d8d48" gracePeriod=2 Dec 01 15:13:54 crc kubenswrapper[4931]: I1201 15:13:54.859983 4931 generic.go:334] "Generic (PLEG): container finished" podID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerID="3663313b69f218d7c977c0c7f5f60e08e1e37d71c9655144b6c8ff328c0d8d48" exitCode=0 Dec 01 15:13:54 crc kubenswrapper[4931]: I1201 15:13:54.860042 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7v66" event={"ID":"6b10c645-d72b-4fe8-8ae0-9f169326fed4","Type":"ContainerDied","Data":"3663313b69f218d7c977c0c7f5f60e08e1e37d71c9655144b6c8ff328c0d8d48"} Dec 01 15:13:55 crc kubenswrapper[4931]: I1201 15:13:55.975406 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.027366 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-catalog-content\") pod \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.027482 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbss8\" (UniqueName: \"kubernetes.io/projected/6b10c645-d72b-4fe8-8ae0-9f169326fed4-kube-api-access-mbss8\") pod \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.027511 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-utilities\") pod \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\" (UID: \"6b10c645-d72b-4fe8-8ae0-9f169326fed4\") " Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 
15:13:56.028504 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-utilities" (OuterVolumeSpecName: "utilities") pod "6b10c645-d72b-4fe8-8ae0-9f169326fed4" (UID: "6b10c645-d72b-4fe8-8ae0-9f169326fed4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.033563 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b10c645-d72b-4fe8-8ae0-9f169326fed4-kube-api-access-mbss8" (OuterVolumeSpecName: "kube-api-access-mbss8") pod "6b10c645-d72b-4fe8-8ae0-9f169326fed4" (UID: "6b10c645-d72b-4fe8-8ae0-9f169326fed4"). InnerVolumeSpecName "kube-api-access-mbss8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.126548 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b10c645-d72b-4fe8-8ae0-9f169326fed4" (UID: "6b10c645-d72b-4fe8-8ae0-9f169326fed4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.128847 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbss8\" (UniqueName: \"kubernetes.io/projected/6b10c645-d72b-4fe8-8ae0-9f169326fed4-kube-api-access-mbss8\") on node \"crc\" DevicePath \"\"" Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.128888 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.128904 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b10c645-d72b-4fe8-8ae0-9f169326fed4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.873906 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7v66" event={"ID":"6b10c645-d72b-4fe8-8ae0-9f169326fed4","Type":"ContainerDied","Data":"8b46837c3ed2f993df087f424818feaee2cc6556c98400d8c57321f57ae1ec18"} Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.874010 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f7v66" Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.874303 4931 scope.go:117] "RemoveContainer" containerID="3663313b69f218d7c977c0c7f5f60e08e1e37d71c9655144b6c8ff328c0d8d48" Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.903004 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7v66"] Dec 01 15:13:56 crc kubenswrapper[4931]: I1201 15:13:56.909823 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f7v66"] Dec 01 15:13:57 crc kubenswrapper[4931]: I1201 15:13:57.317356 4931 scope.go:117] "RemoveContainer" containerID="ec3a47abfaa8a4c128ad6c7f6ca03be931d2adf191614c28cc07575af25e0752" Dec 01 15:13:57 crc kubenswrapper[4931]: I1201 15:13:57.417359 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:57 crc kubenswrapper[4931]: I1201 15:13:57.417896 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:57 crc kubenswrapper[4931]: I1201 15:13:57.440132 4931 scope.go:117] "RemoveContainer" containerID="142a11cf01d0408a03c71b7bee2ff0c05363de8de4b1cd051a8882b290f4de2b" Dec 01 15:13:57 crc kubenswrapper[4931]: I1201 15:13:57.469665 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:57 crc kubenswrapper[4931]: I1201 15:13:57.900153 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7" event={"ID":"8be2b09a-fed1-4f7f-9424-5ca00a814c3d","Type":"ContainerStarted","Data":"3b7e4cd96376cfdceda2af38fc978e6ccb1c6546ff30704a7a87f5b2fc728e41"} Dec 01 15:13:57 crc kubenswrapper[4931]: I1201 15:13:57.970849 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.261774 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" path="/var/lib/kubelet/pods/6b10c645-d72b-4fe8-8ae0-9f169326fed4/volumes" Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.320181 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4xwh"] Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.505061 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8422"] Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.506695 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h8422" podUID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerName="registry-server" containerID="cri-o://a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d" gracePeriod=2 Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.892824 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.910087 4931 generic.go:334] "Generic (PLEG): container finished" podID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerID="a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d" exitCode=0 Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.910171 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8422" event={"ID":"eda61d02-1a5b-4602-935e-04ded0ccddb5","Type":"ContainerDied","Data":"a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d"} Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.910209 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8422" event={"ID":"eda61d02-1a5b-4602-935e-04ded0ccddb5","Type":"ContainerDied","Data":"c1792ea1df3caa0e9cc1068ab3dbacecffc55391dc02ca0b8b602c930aa78cb8"} Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.910233 4931 scope.go:117] "RemoveContainer" containerID="a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d" Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.910404 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8422" Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.959334 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" event={"ID":"f821ad14-f34f-49cb-a884-905cf0219454","Type":"ContainerStarted","Data":"334dc4edb4fc26016d91a1bd5faf22df36bb6f8a7d70d778e2a8f815bb1808cd"} Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.959773 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.961110 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" event={"ID":"a5b7cbc9-6afb-414c-ac6e-569b6d9634ed","Type":"ContainerStarted","Data":"81dbcbc2cb456c57dc9e08aaa1ee373d772a6010a9860d0a27cca6a63186813e"} Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.963256 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8ljjr" event={"ID":"fd514864-e79c-4cd7-9517-0b3f9fbee078","Type":"ContainerStarted","Data":"1c69656f76b071f3b4153b168a3886ceda4d1de78e8412adcad9d3b3f063e1b5"} Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.963288 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.979843 4931 scope.go:117] "RemoveContainer" containerID="4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5" Dec 01 15:13:58 crc kubenswrapper[4931]: I1201 15:13:58.984060 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" podStartSLOduration=3.239977993 podStartE2EDuration="8.984043724s" podCreationTimestamp="2025-12-01 15:13:50 +0000 UTC" firstStartedPulling="2025-12-01 15:13:51.737571324 +0000 UTC m=+778.163444991" 
lastFinishedPulling="2025-12-01 15:13:57.481637025 +0000 UTC m=+783.907510722" observedRunningTime="2025-12-01 15:13:58.980076482 +0000 UTC m=+785.405950149" watchObservedRunningTime="2025-12-01 15:13:58.984043724 +0000 UTC m=+785.409917391" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.001730 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8ljjr" podStartSLOduration=2.809251941 podStartE2EDuration="9.001709425s" podCreationTimestamp="2025-12-01 15:13:50 +0000 UTC" firstStartedPulling="2025-12-01 15:13:51.297567508 +0000 UTC m=+777.723441175" lastFinishedPulling="2025-12-01 15:13:57.490024972 +0000 UTC m=+783.915898659" observedRunningTime="2025-12-01 15:13:58.999122022 +0000 UTC m=+785.424995689" watchObservedRunningTime="2025-12-01 15:13:59.001709425 +0000 UTC m=+785.427583092" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.002458 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-utilities\") pod \"eda61d02-1a5b-4602-935e-04ded0ccddb5\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.002691 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-catalog-content\") pod \"eda61d02-1a5b-4602-935e-04ded0ccddb5\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.002720 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k99gn\" (UniqueName: \"kubernetes.io/projected/eda61d02-1a5b-4602-935e-04ded0ccddb5-kube-api-access-k99gn\") pod \"eda61d02-1a5b-4602-935e-04ded0ccddb5\" (UID: \"eda61d02-1a5b-4602-935e-04ded0ccddb5\") " Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.005546 
4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-utilities" (OuterVolumeSpecName: "utilities") pod "eda61d02-1a5b-4602-935e-04ded0ccddb5" (UID: "eda61d02-1a5b-4602-935e-04ded0ccddb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.015417 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda61d02-1a5b-4602-935e-04ded0ccddb5-kube-api-access-k99gn" (OuterVolumeSpecName: "kube-api-access-k99gn") pod "eda61d02-1a5b-4602-935e-04ded0ccddb5" (UID: "eda61d02-1a5b-4602-935e-04ded0ccddb5"). InnerVolumeSpecName "kube-api-access-k99gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.038947 4931 scope.go:117] "RemoveContainer" containerID="f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.052646 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-lt2nw" podStartSLOduration=2.210740332 podStartE2EDuration="8.052617668s" podCreationTimestamp="2025-12-01 15:13:51 +0000 UTC" firstStartedPulling="2025-12-01 15:13:51.638078421 +0000 UTC m=+778.063952088" lastFinishedPulling="2025-12-01 15:13:57.479955757 +0000 UTC m=+783.905829424" observedRunningTime="2025-12-01 15:13:59.026349284 +0000 UTC m=+785.452222961" watchObservedRunningTime="2025-12-01 15:13:59.052617668 +0000 UTC m=+785.478491335" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.068159 4931 scope.go:117] "RemoveContainer" containerID="a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d" Dec 01 15:13:59 crc kubenswrapper[4931]: E1201 15:13:59.068781 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d\": container with ID starting with a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d not found: ID does not exist" containerID="a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.068832 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d"} err="failed to get container status \"a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d\": rpc error: code = NotFound desc = could not find container \"a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d\": container with ID starting with a4b9956cb181a2830c32ab3f94e9466527da52c9120610868353789ffcedf07d not found: ID does not exist" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.068862 4931 scope.go:117] "RemoveContainer" containerID="4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5" Dec 01 15:13:59 crc kubenswrapper[4931]: E1201 15:13:59.069094 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5\": container with ID starting with 4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5 not found: ID does not exist" containerID="4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.069125 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5"} err="failed to get container status \"4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5\": rpc error: code = NotFound desc = could not find container \"4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5\": 
container with ID starting with 4f7f4a9e58c79d6ec08aee20b5315de947921d810f1cc0f42c030d19c2a292e5 not found: ID does not exist" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.072015 4931 scope.go:117] "RemoveContainer" containerID="f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e" Dec 01 15:13:59 crc kubenswrapper[4931]: E1201 15:13:59.073628 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e\": container with ID starting with f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e not found: ID does not exist" containerID="f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.073691 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e"} err="failed to get container status \"f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e\": rpc error: code = NotFound desc = could not find container \"f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e\": container with ID starting with f24a3b111989b4f765f7cd754426d8e7b2b224bf9ea25793936f5e3085a2534e not found: ID does not exist" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.082739 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eda61d02-1a5b-4602-935e-04ded0ccddb5" (UID: "eda61d02-1a5b-4602-935e-04ded0ccddb5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.104940 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.104980 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k99gn\" (UniqueName: \"kubernetes.io/projected/eda61d02-1a5b-4602-935e-04ded0ccddb5-kube-api-access-k99gn\") on node \"crc\" DevicePath \"\"" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.104993 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eda61d02-1a5b-4602-935e-04ded0ccddb5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.239970 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8422"] Dec 01 15:13:59 crc kubenswrapper[4931]: I1201 15:13:59.251335 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h8422"] Dec 01 15:14:00 crc kubenswrapper[4931]: I1201 15:14:00.250430 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda61d02-1a5b-4602-935e-04ded0ccddb5" path="/var/lib/kubelet/pods/eda61d02-1a5b-4602-935e-04ded0ccddb5/volumes" Dec 01 15:14:01 crc kubenswrapper[4931]: I1201 15:14:01.567511 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:14:01 crc kubenswrapper[4931]: I1201 15:14:01.568186 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:14:01 crc kubenswrapper[4931]: I1201 15:14:01.573867 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:14:02 crc kubenswrapper[4931]: I1201 15:14:02.004060 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7" event={"ID":"8be2b09a-fed1-4f7f-9424-5ca00a814c3d","Type":"ContainerStarted","Data":"7b58c678dfe82d727cda113d451fae8781e8756fe0dbfd2f5b6ee56d285e3bfe"} Dec 01 15:14:02 crc kubenswrapper[4931]: I1201 15:14:02.012588 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-858544c664-mmdhk" Dec 01 15:14:02 crc kubenswrapper[4931]: I1201 15:14:02.038032 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4xwr7" podStartSLOduration=2.723783212 podStartE2EDuration="12.038001865s" podCreationTimestamp="2025-12-01 15:13:50 +0000 UTC" firstStartedPulling="2025-12-01 15:13:51.710317203 +0000 UTC m=+778.136190870" lastFinishedPulling="2025-12-01 15:14:01.024535856 +0000 UTC m=+787.450409523" observedRunningTime="2025-12-01 15:14:02.032041527 +0000 UTC m=+788.457915234" watchObservedRunningTime="2025-12-01 15:14:02.038001865 +0000 UTC m=+788.463875572" Dec 01 15:14:02 crc kubenswrapper[4931]: I1201 15:14:02.147877 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-drngn"] Dec 01 15:14:06 crc kubenswrapper[4931]: I1201 15:14:06.294514 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8ljjr" Dec 01 15:14:11 crc kubenswrapper[4931]: I1201 15:14:11.263426 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-5xlg8" Dec 01 15:14:19 crc kubenswrapper[4931]: I1201 15:14:19.872040 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:14:19 crc kubenswrapper[4931]: I1201 15:14:19.872924 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.561838 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds"] Dec 01 15:14:24 crc kubenswrapper[4931]: E1201 15:14:24.562576 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerName="registry-server" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.562590 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerName="registry-server" Dec 01 15:14:24 crc kubenswrapper[4931]: E1201 15:14:24.562599 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerName="extract-utilities" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.562604 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerName="extract-utilities" Dec 01 15:14:24 crc kubenswrapper[4931]: E1201 15:14:24.562618 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerName="extract-content" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.562624 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerName="extract-content" Dec 01 15:14:24 crc kubenswrapper[4931]: E1201 15:14:24.562630 4931 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerName="extract-content" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.562636 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerName="extract-content" Dec 01 15:14:24 crc kubenswrapper[4931]: E1201 15:14:24.562648 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerName="extract-utilities" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.562654 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerName="extract-utilities" Dec 01 15:14:24 crc kubenswrapper[4931]: E1201 15:14:24.562662 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerName="registry-server" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.562667 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerName="registry-server" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.562763 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b10c645-d72b-4fe8-8ae0-9f169326fed4" containerName="registry-server" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.562778 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda61d02-1a5b-4602-935e-04ded0ccddb5" containerName="registry-server" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.563574 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.565993 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.574204 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds"] Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.620122 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.620178 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.620200 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4wdm\" (UniqueName: \"kubernetes.io/projected/f2d91777-625a-4229-90cb-820f107037e5-kube-api-access-p4wdm\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: 
I1201 15:14:24.721417 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.721559 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.721607 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4wdm\" (UniqueName: \"kubernetes.io/projected/f2d91777-625a-4229-90cb-820f107037e5-kube-api-access-p4wdm\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.722352 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.722407 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.744729 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4wdm\" (UniqueName: \"kubernetes.io/projected/f2d91777-625a-4229-90cb-820f107037e5-kube-api-access-p4wdm\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:24 crc kubenswrapper[4931]: I1201 15:14:24.881631 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:25 crc kubenswrapper[4931]: I1201 15:14:25.114099 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds"] Dec 01 15:14:25 crc kubenswrapper[4931]: W1201 15:14:25.123800 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2d91777_625a_4229_90cb_820f107037e5.slice/crio-8b98f569c3156137e8ecdb3c51b8be989b214cdf5522d8729c44d5af7f1106c7 WatchSource:0}: Error finding container 8b98f569c3156137e8ecdb3c51b8be989b214cdf5522d8729c44d5af7f1106c7: Status 404 returned error can't find the container with id 8b98f569c3156137e8ecdb3c51b8be989b214cdf5522d8729c44d5af7f1106c7 Dec 01 15:14:25 crc kubenswrapper[4931]: I1201 15:14:25.163077 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" 
event={"ID":"f2d91777-625a-4229-90cb-820f107037e5","Type":"ContainerStarted","Data":"8b98f569c3156137e8ecdb3c51b8be989b214cdf5522d8729c44d5af7f1106c7"} Dec 01 15:14:26 crc kubenswrapper[4931]: I1201 15:14:26.171717 4931 generic.go:334] "Generic (PLEG): container finished" podID="f2d91777-625a-4229-90cb-820f107037e5" containerID="4fb13e1e2579532557ece36f5d0e15cdfc97bba6d3b5102c97688750e8f8fc61" exitCode=0 Dec 01 15:14:26 crc kubenswrapper[4931]: I1201 15:14:26.171969 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" event={"ID":"f2d91777-625a-4229-90cb-820f107037e5","Type":"ContainerDied","Data":"4fb13e1e2579532557ece36f5d0e15cdfc97bba6d3b5102c97688750e8f8fc61"} Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.196467 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-drngn" podUID="c907e960-f833-4546-89df-491334c4fe72" containerName="console" containerID="cri-o://71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77" gracePeriod=15 Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.615303 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-drngn_c907e960-f833-4546-89df-491334c4fe72/console/0.log" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.615611 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.769617 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-oauth-config\") pod \"c907e960-f833-4546-89df-491334c4fe72\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.769936 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-serving-cert\") pod \"c907e960-f833-4546-89df-491334c4fe72\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.769956 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-service-ca\") pod \"c907e960-f833-4546-89df-491334c4fe72\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.769994 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-oauth-serving-cert\") pod \"c907e960-f833-4546-89df-491334c4fe72\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.770031 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mswth\" (UniqueName: \"kubernetes.io/projected/c907e960-f833-4546-89df-491334c4fe72-kube-api-access-mswth\") pod \"c907e960-f833-4546-89df-491334c4fe72\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.770056 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-console-config\") pod \"c907e960-f833-4546-89df-491334c4fe72\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.770076 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-trusted-ca-bundle\") pod \"c907e960-f833-4546-89df-491334c4fe72\" (UID: \"c907e960-f833-4546-89df-491334c4fe72\") " Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.770949 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c907e960-f833-4546-89df-491334c4fe72" (UID: "c907e960-f833-4546-89df-491334c4fe72"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.770991 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c907e960-f833-4546-89df-491334c4fe72" (UID: "c907e960-f833-4546-89df-491334c4fe72"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.770984 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-console-config" (OuterVolumeSpecName: "console-config") pod "c907e960-f833-4546-89df-491334c4fe72" (UID: "c907e960-f833-4546-89df-491334c4fe72"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.771041 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-service-ca" (OuterVolumeSpecName: "service-ca") pod "c907e960-f833-4546-89df-491334c4fe72" (UID: "c907e960-f833-4546-89df-491334c4fe72"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.775642 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c907e960-f833-4546-89df-491334c4fe72" (UID: "c907e960-f833-4546-89df-491334c4fe72"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.775679 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c907e960-f833-4546-89df-491334c4fe72-kube-api-access-mswth" (OuterVolumeSpecName: "kube-api-access-mswth") pod "c907e960-f833-4546-89df-491334c4fe72" (UID: "c907e960-f833-4546-89df-491334c4fe72"). InnerVolumeSpecName "kube-api-access-mswth". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.782650 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c907e960-f833-4546-89df-491334c4fe72" (UID: "c907e960-f833-4546-89df-491334c4fe72"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.871985 4931 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.872034 4931 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c907e960-f833-4546-89df-491334c4fe72-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.872047 4931 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.872060 4931 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.872074 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mswth\" (UniqueName: \"kubernetes.io/projected/c907e960-f833-4546-89df-491334c4fe72-kube-api-access-mswth\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.872085 4931 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:27 crc kubenswrapper[4931]: I1201 15:14:27.872092 4931 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c907e960-f833-4546-89df-491334c4fe72-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:28 crc 
kubenswrapper[4931]: I1201 15:14:28.204131 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-drngn_c907e960-f833-4546-89df-491334c4fe72/console/0.log" Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.204201 4931 generic.go:334] "Generic (PLEG): container finished" podID="c907e960-f833-4546-89df-491334c4fe72" containerID="71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77" exitCode=2 Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.204286 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-drngn" Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.204283 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-drngn" event={"ID":"c907e960-f833-4546-89df-491334c4fe72","Type":"ContainerDied","Data":"71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77"} Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.204422 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-drngn" event={"ID":"c907e960-f833-4546-89df-491334c4fe72","Type":"ContainerDied","Data":"5ce858f2ca74a69d3d395cc8649134ec0be1dfab849d5a2c2c3057360cc2e295"} Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.204443 4931 scope.go:117] "RemoveContainer" containerID="71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77" Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.207666 4931 generic.go:334] "Generic (PLEG): container finished" podID="f2d91777-625a-4229-90cb-820f107037e5" containerID="b68c4fb4c16b208038709e1380d44442cedce4cd42b1c0cb47fff369aefe4611" exitCode=0 Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.207709 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" 
event={"ID":"f2d91777-625a-4229-90cb-820f107037e5","Type":"ContainerDied","Data":"b68c4fb4c16b208038709e1380d44442cedce4cd42b1c0cb47fff369aefe4611"} Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.246475 4931 scope.go:117] "RemoveContainer" containerID="71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77" Dec 01 15:14:28 crc kubenswrapper[4931]: E1201 15:14:28.247139 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77\": container with ID starting with 71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77 not found: ID does not exist" containerID="71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77" Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.247185 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77"} err="failed to get container status \"71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77\": rpc error: code = NotFound desc = could not find container \"71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77\": container with ID starting with 71c10af47b1f2d94994301507ed5c9a83c1c5953a1925a178cf02e8519423d77 not found: ID does not exist" Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.251247 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-drngn"] Dec 01 15:14:28 crc kubenswrapper[4931]: I1201 15:14:28.256961 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-drngn"] Dec 01 15:14:29 crc kubenswrapper[4931]: I1201 15:14:29.217541 4931 generic.go:334] "Generic (PLEG): container finished" podID="f2d91777-625a-4229-90cb-820f107037e5" containerID="4581b645085639cc32dbba9a0d197e586d9a982dcaee6cd88ea7d9c904daf828" exitCode=0 Dec 01 15:14:29 crc 
kubenswrapper[4931]: I1201 15:14:29.217827 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" event={"ID":"f2d91777-625a-4229-90cb-820f107037e5","Type":"ContainerDied","Data":"4581b645085639cc32dbba9a0d197e586d9a982dcaee6cd88ea7d9c904daf828"} Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.263136 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c907e960-f833-4546-89df-491334c4fe72" path="/var/lib/kubelet/pods/c907e960-f833-4546-89df-491334c4fe72/volumes" Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.607584 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.717244 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-util\") pod \"f2d91777-625a-4229-90cb-820f107037e5\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.717370 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4wdm\" (UniqueName: \"kubernetes.io/projected/f2d91777-625a-4229-90cb-820f107037e5-kube-api-access-p4wdm\") pod \"f2d91777-625a-4229-90cb-820f107037e5\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.717468 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-bundle\") pod \"f2d91777-625a-4229-90cb-820f107037e5\" (UID: \"f2d91777-625a-4229-90cb-820f107037e5\") " Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.719466 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-bundle" (OuterVolumeSpecName: "bundle") pod "f2d91777-625a-4229-90cb-820f107037e5" (UID: "f2d91777-625a-4229-90cb-820f107037e5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.726647 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2d91777-625a-4229-90cb-820f107037e5-kube-api-access-p4wdm" (OuterVolumeSpecName: "kube-api-access-p4wdm") pod "f2d91777-625a-4229-90cb-820f107037e5" (UID: "f2d91777-625a-4229-90cb-820f107037e5"). InnerVolumeSpecName "kube-api-access-p4wdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.798315 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-util" (OuterVolumeSpecName: "util") pod "f2d91777-625a-4229-90cb-820f107037e5" (UID: "f2d91777-625a-4229-90cb-820f107037e5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.819152 4931 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-util\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.819205 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4wdm\" (UniqueName: \"kubernetes.io/projected/f2d91777-625a-4229-90cb-820f107037e5-kube-api-access-p4wdm\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:30 crc kubenswrapper[4931]: I1201 15:14:30.819219 4931 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2d91777-625a-4229-90cb-820f107037e5-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:14:31 crc kubenswrapper[4931]: I1201 15:14:31.240362 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" event={"ID":"f2d91777-625a-4229-90cb-820f107037e5","Type":"ContainerDied","Data":"8b98f569c3156137e8ecdb3c51b8be989b214cdf5522d8729c44d5af7f1106c7"} Dec 01 15:14:31 crc kubenswrapper[4931]: I1201 15:14:31.240454 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds" Dec 01 15:14:31 crc kubenswrapper[4931]: I1201 15:14:31.240488 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b98f569c3156137e8ecdb3c51b8be989b214cdf5522d8729c44d5af7f1106c7" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.449173 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh"] Dec 01 15:14:39 crc kubenswrapper[4931]: E1201 15:14:39.450016 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d91777-625a-4229-90cb-820f107037e5" containerName="util" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.450033 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d91777-625a-4229-90cb-820f107037e5" containerName="util" Dec 01 15:14:39 crc kubenswrapper[4931]: E1201 15:14:39.450043 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d91777-625a-4229-90cb-820f107037e5" containerName="pull" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.450049 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d91777-625a-4229-90cb-820f107037e5" containerName="pull" Dec 01 15:14:39 crc kubenswrapper[4931]: E1201 15:14:39.450065 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d91777-625a-4229-90cb-820f107037e5" containerName="extract" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.450071 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d91777-625a-4229-90cb-820f107037e5" containerName="extract" Dec 01 15:14:39 crc kubenswrapper[4931]: E1201 15:14:39.450085 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c907e960-f833-4546-89df-491334c4fe72" containerName="console" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.450091 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c907e960-f833-4546-89df-491334c4fe72" containerName="console" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.450192 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d91777-625a-4229-90cb-820f107037e5" containerName="extract" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.450209 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c907e960-f833-4546-89df-491334c4fe72" containerName="console" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.450633 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.461726 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.461740 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.462123 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh"] Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.462274 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.462572 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-6lm47" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.462799 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.548105 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/14764be0-1d4c-47d3-9d5e-3682a7857d04-webhook-cert\") pod \"metallb-operator-controller-manager-549c5b5d67-wrmzh\" (UID: \"14764be0-1d4c-47d3-9d5e-3682a7857d04\") " pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.548159 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpt77\" (UniqueName: \"kubernetes.io/projected/14764be0-1d4c-47d3-9d5e-3682a7857d04-kube-api-access-lpt77\") pod \"metallb-operator-controller-manager-549c5b5d67-wrmzh\" (UID: \"14764be0-1d4c-47d3-9d5e-3682a7857d04\") " pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.548196 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14764be0-1d4c-47d3-9d5e-3682a7857d04-apiservice-cert\") pod \"metallb-operator-controller-manager-549c5b5d67-wrmzh\" (UID: \"14764be0-1d4c-47d3-9d5e-3682a7857d04\") " pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.649378 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14764be0-1d4c-47d3-9d5e-3682a7857d04-webhook-cert\") pod \"metallb-operator-controller-manager-549c5b5d67-wrmzh\" (UID: \"14764be0-1d4c-47d3-9d5e-3682a7857d04\") " pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.649449 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpt77\" (UniqueName: \"kubernetes.io/projected/14764be0-1d4c-47d3-9d5e-3682a7857d04-kube-api-access-lpt77\") pod \"metallb-operator-controller-manager-549c5b5d67-wrmzh\" (UID: 
\"14764be0-1d4c-47d3-9d5e-3682a7857d04\") " pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.649478 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14764be0-1d4c-47d3-9d5e-3682a7857d04-apiservice-cert\") pod \"metallb-operator-controller-manager-549c5b5d67-wrmzh\" (UID: \"14764be0-1d4c-47d3-9d5e-3682a7857d04\") " pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.655767 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14764be0-1d4c-47d3-9d5e-3682a7857d04-webhook-cert\") pod \"metallb-operator-controller-manager-549c5b5d67-wrmzh\" (UID: \"14764be0-1d4c-47d3-9d5e-3682a7857d04\") " pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.656071 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14764be0-1d4c-47d3-9d5e-3682a7857d04-apiservice-cert\") pod \"metallb-operator-controller-manager-549c5b5d67-wrmzh\" (UID: \"14764be0-1d4c-47d3-9d5e-3682a7857d04\") " pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.668828 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpt77\" (UniqueName: \"kubernetes.io/projected/14764be0-1d4c-47d3-9d5e-3682a7857d04-kube-api-access-lpt77\") pod \"metallb-operator-controller-manager-549c5b5d67-wrmzh\" (UID: \"14764be0-1d4c-47d3-9d5e-3682a7857d04\") " pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.808766 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.881191 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9"] Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.881958 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.885148 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.885168 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.885251 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-lkwbj" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.914306 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9"] Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.952879 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63d2099f-ba55-4f55-97ac-bca52404a30a-apiservice-cert\") pod \"metallb-operator-webhook-server-dbbf79d98-cbdx9\" (UID: \"63d2099f-ba55-4f55-97ac-bca52404a30a\") " pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.952969 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63d2099f-ba55-4f55-97ac-bca52404a30a-webhook-cert\") pod \"metallb-operator-webhook-server-dbbf79d98-cbdx9\" 
(UID: \"63d2099f-ba55-4f55-97ac-bca52404a30a\") " pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:39 crc kubenswrapper[4931]: I1201 15:14:39.953000 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxrt6\" (UniqueName: \"kubernetes.io/projected/63d2099f-ba55-4f55-97ac-bca52404a30a-kube-api-access-lxrt6\") pod \"metallb-operator-webhook-server-dbbf79d98-cbdx9\" (UID: \"63d2099f-ba55-4f55-97ac-bca52404a30a\") " pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:40 crc kubenswrapper[4931]: I1201 15:14:40.055302 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63d2099f-ba55-4f55-97ac-bca52404a30a-apiservice-cert\") pod \"metallb-operator-webhook-server-dbbf79d98-cbdx9\" (UID: \"63d2099f-ba55-4f55-97ac-bca52404a30a\") " pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:40 crc kubenswrapper[4931]: I1201 15:14:40.055634 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63d2099f-ba55-4f55-97ac-bca52404a30a-webhook-cert\") pod \"metallb-operator-webhook-server-dbbf79d98-cbdx9\" (UID: \"63d2099f-ba55-4f55-97ac-bca52404a30a\") " pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:40 crc kubenswrapper[4931]: I1201 15:14:40.055656 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxrt6\" (UniqueName: \"kubernetes.io/projected/63d2099f-ba55-4f55-97ac-bca52404a30a-kube-api-access-lxrt6\") pod \"metallb-operator-webhook-server-dbbf79d98-cbdx9\" (UID: \"63d2099f-ba55-4f55-97ac-bca52404a30a\") " pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:40 crc kubenswrapper[4931]: I1201 15:14:40.071052 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/63d2099f-ba55-4f55-97ac-bca52404a30a-webhook-cert\") pod \"metallb-operator-webhook-server-dbbf79d98-cbdx9\" (UID: \"63d2099f-ba55-4f55-97ac-bca52404a30a\") " pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:40 crc kubenswrapper[4931]: I1201 15:14:40.075126 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/63d2099f-ba55-4f55-97ac-bca52404a30a-apiservice-cert\") pod \"metallb-operator-webhook-server-dbbf79d98-cbdx9\" (UID: \"63d2099f-ba55-4f55-97ac-bca52404a30a\") " pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:40 crc kubenswrapper[4931]: I1201 15:14:40.083006 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxrt6\" (UniqueName: \"kubernetes.io/projected/63d2099f-ba55-4f55-97ac-bca52404a30a-kube-api-access-lxrt6\") pod \"metallb-operator-webhook-server-dbbf79d98-cbdx9\" (UID: \"63d2099f-ba55-4f55-97ac-bca52404a30a\") " pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:40 crc kubenswrapper[4931]: I1201 15:14:40.201348 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:40 crc kubenswrapper[4931]: I1201 15:14:40.373753 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh"] Dec 01 15:14:40 crc kubenswrapper[4931]: I1201 15:14:40.723941 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9"] Dec 01 15:14:41 crc kubenswrapper[4931]: I1201 15:14:41.311028 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" event={"ID":"14764be0-1d4c-47d3-9d5e-3682a7857d04","Type":"ContainerStarted","Data":"7031a865e40751607548662cf269efecd799e78abef1ed9141c9068e0b3b9825"} Dec 01 15:14:41 crc kubenswrapper[4931]: I1201 15:14:41.312687 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" event={"ID":"63d2099f-ba55-4f55-97ac-bca52404a30a","Type":"ContainerStarted","Data":"4c3258c2142ca2bf6286b92ff51179d6f52e89d93d2d1b4617a082fafd6ed7ac"} Dec 01 15:14:44 crc kubenswrapper[4931]: I1201 15:14:44.357261 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" event={"ID":"14764be0-1d4c-47d3-9d5e-3682a7857d04","Type":"ContainerStarted","Data":"1d4d1cdcd46a04a0b8d10a45265ada354cf307aa143c7335298a8c460778935c"} Dec 01 15:14:44 crc kubenswrapper[4931]: I1201 15:14:44.359032 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:14:44 crc kubenswrapper[4931]: I1201 15:14:44.379604 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" podStartSLOduration=1.9143381640000001 podStartE2EDuration="5.379579295s" 
podCreationTimestamp="2025-12-01 15:14:39 +0000 UTC" firstStartedPulling="2025-12-01 15:14:40.383560348 +0000 UTC m=+826.809434015" lastFinishedPulling="2025-12-01 15:14:43.848801479 +0000 UTC m=+830.274675146" observedRunningTime="2025-12-01 15:14:44.377535977 +0000 UTC m=+830.803409644" watchObservedRunningTime="2025-12-01 15:14:44.379579295 +0000 UTC m=+830.805452962" Dec 01 15:14:46 crc kubenswrapper[4931]: I1201 15:14:46.374210 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" event={"ID":"63d2099f-ba55-4f55-97ac-bca52404a30a","Type":"ContainerStarted","Data":"2de32988fd6ad4b89569e345fee8c25a41aa34be2eb543c4c715138ce50c3c58"} Dec 01 15:14:46 crc kubenswrapper[4931]: I1201 15:14:46.374737 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:14:46 crc kubenswrapper[4931]: I1201 15:14:46.407345 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" podStartSLOduration=2.384843252 podStartE2EDuration="7.407322716s" podCreationTimestamp="2025-12-01 15:14:39 +0000 UTC" firstStartedPulling="2025-12-01 15:14:40.740558918 +0000 UTC m=+827.166432585" lastFinishedPulling="2025-12-01 15:14:45.763038382 +0000 UTC m=+832.188912049" observedRunningTime="2025-12-01 15:14:46.402388406 +0000 UTC m=+832.828262073" watchObservedRunningTime="2025-12-01 15:14:46.407322716 +0000 UTC m=+832.833196383" Dec 01 15:14:49 crc kubenswrapper[4931]: I1201 15:14:49.872344 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:14:49 crc kubenswrapper[4931]: I1201 15:14:49.872860 4931 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:14:49 crc kubenswrapper[4931]: I1201 15:14:49.872925 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:14:49 crc kubenswrapper[4931]: I1201 15:14:49.873735 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c593bc454b5d325cbd0967c1c5d7f0f229621585e06f8319b965d66c0d93b5d"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:14:49 crc kubenswrapper[4931]: I1201 15:14:49.873808 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://2c593bc454b5d325cbd0967c1c5d7f0f229621585e06f8319b965d66c0d93b5d" gracePeriod=600 Dec 01 15:14:50 crc kubenswrapper[4931]: I1201 15:14:50.413443 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="2c593bc454b5d325cbd0967c1c5d7f0f229621585e06f8319b965d66c0d93b5d" exitCode=0 Dec 01 15:14:50 crc kubenswrapper[4931]: I1201 15:14:50.413565 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"2c593bc454b5d325cbd0967c1c5d7f0f229621585e06f8319b965d66c0d93b5d"} Dec 01 15:14:50 crc kubenswrapper[4931]: I1201 15:14:50.413878 
4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"4f551f2eb27cc6e8158d6be30d6ee18e92fc02ccf79b3c9e4d9f5dcf4740103b"} Dec 01 15:14:50 crc kubenswrapper[4931]: I1201 15:14:50.413922 4931 scope.go:117] "RemoveContainer" containerID="edf71cb1bce5d3aeba5997d04dcf4aa6526fc61ac3e3b1daeb2f121fedfeeabd" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.194376 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk"] Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.195543 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.197569 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.203273 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.210641 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-dbbf79d98-cbdx9" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.212284 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk"] Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.287304 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzjrw\" (UniqueName: \"kubernetes.io/projected/269c4450-aefe-45b5-b753-0be7e90e8af6-kube-api-access-rzjrw\") pod \"collect-profiles-29410035-6dclk\" (UID: 
\"269c4450-aefe-45b5-b753-0be7e90e8af6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.287460 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/269c4450-aefe-45b5-b753-0be7e90e8af6-secret-volume\") pod \"collect-profiles-29410035-6dclk\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.287529 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269c4450-aefe-45b5-b753-0be7e90e8af6-config-volume\") pod \"collect-profiles-29410035-6dclk\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.389088 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269c4450-aefe-45b5-b753-0be7e90e8af6-config-volume\") pod \"collect-profiles-29410035-6dclk\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.389190 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzjrw\" (UniqueName: \"kubernetes.io/projected/269c4450-aefe-45b5-b753-0be7e90e8af6-kube-api-access-rzjrw\") pod \"collect-profiles-29410035-6dclk\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.389244 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/269c4450-aefe-45b5-b753-0be7e90e8af6-secret-volume\") pod \"collect-profiles-29410035-6dclk\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.390003 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269c4450-aefe-45b5-b753-0be7e90e8af6-config-volume\") pod \"collect-profiles-29410035-6dclk\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.399198 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/269c4450-aefe-45b5-b753-0be7e90e8af6-secret-volume\") pod \"collect-profiles-29410035-6dclk\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.407259 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzjrw\" (UniqueName: \"kubernetes.io/projected/269c4450-aefe-45b5-b753-0be7e90e8af6-kube-api-access-rzjrw\") pod \"collect-profiles-29410035-6dclk\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.510955 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:00 crc kubenswrapper[4931]: I1201 15:15:00.808329 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk"] Dec 01 15:15:00 crc kubenswrapper[4931]: W1201 15:15:00.823707 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod269c4450_aefe_45b5_b753_0be7e90e8af6.slice/crio-57d96702c985ad3e4326cc80324d7895f55e04e64acf0907a4f455f12f049a27 WatchSource:0}: Error finding container 57d96702c985ad3e4326cc80324d7895f55e04e64acf0907a4f455f12f049a27: Status 404 returned error can't find the container with id 57d96702c985ad3e4326cc80324d7895f55e04e64acf0907a4f455f12f049a27 Dec 01 15:15:01 crc kubenswrapper[4931]: I1201 15:15:01.482297 4931 generic.go:334] "Generic (PLEG): container finished" podID="269c4450-aefe-45b5-b753-0be7e90e8af6" containerID="bd6c263e7e46de183945e0ae47b15f6382e2d85efa702a709ac3aa199e922967" exitCode=0 Dec 01 15:15:01 crc kubenswrapper[4931]: I1201 15:15:01.482345 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" event={"ID":"269c4450-aefe-45b5-b753-0be7e90e8af6","Type":"ContainerDied","Data":"bd6c263e7e46de183945e0ae47b15f6382e2d85efa702a709ac3aa199e922967"} Dec 01 15:15:01 crc kubenswrapper[4931]: I1201 15:15:01.482431 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" event={"ID":"269c4450-aefe-45b5-b753-0be7e90e8af6","Type":"ContainerStarted","Data":"57d96702c985ad3e4326cc80324d7895f55e04e64acf0907a4f455f12f049a27"} Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.764138 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.840630 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269c4450-aefe-45b5-b753-0be7e90e8af6-config-volume\") pod \"269c4450-aefe-45b5-b753-0be7e90e8af6\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.840794 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/269c4450-aefe-45b5-b753-0be7e90e8af6-secret-volume\") pod \"269c4450-aefe-45b5-b753-0be7e90e8af6\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.840971 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzjrw\" (UniqueName: \"kubernetes.io/projected/269c4450-aefe-45b5-b753-0be7e90e8af6-kube-api-access-rzjrw\") pod \"269c4450-aefe-45b5-b753-0be7e90e8af6\" (UID: \"269c4450-aefe-45b5-b753-0be7e90e8af6\") " Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.841691 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269c4450-aefe-45b5-b753-0be7e90e8af6-config-volume" (OuterVolumeSpecName: "config-volume") pod "269c4450-aefe-45b5-b753-0be7e90e8af6" (UID: "269c4450-aefe-45b5-b753-0be7e90e8af6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.849976 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269c4450-aefe-45b5-b753-0be7e90e8af6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "269c4450-aefe-45b5-b753-0be7e90e8af6" (UID: "269c4450-aefe-45b5-b753-0be7e90e8af6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.854668 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269c4450-aefe-45b5-b753-0be7e90e8af6-kube-api-access-rzjrw" (OuterVolumeSpecName: "kube-api-access-rzjrw") pod "269c4450-aefe-45b5-b753-0be7e90e8af6" (UID: "269c4450-aefe-45b5-b753-0be7e90e8af6"). InnerVolumeSpecName "kube-api-access-rzjrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.943164 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzjrw\" (UniqueName: \"kubernetes.io/projected/269c4450-aefe-45b5-b753-0be7e90e8af6-kube-api-access-rzjrw\") on node \"crc\" DevicePath \"\"" Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.943220 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/269c4450-aefe-45b5-b753-0be7e90e8af6-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:15:02 crc kubenswrapper[4931]: I1201 15:15:02.943237 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/269c4450-aefe-45b5-b753-0be7e90e8af6-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:15:03 crc kubenswrapper[4931]: I1201 15:15:03.500864 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" event={"ID":"269c4450-aefe-45b5-b753-0be7e90e8af6","Type":"ContainerDied","Data":"57d96702c985ad3e4326cc80324d7895f55e04e64acf0907a4f455f12f049a27"} Dec 01 15:15:03 crc kubenswrapper[4931]: I1201 15:15:03.501684 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57d96702c985ad3e4326cc80324d7895f55e04e64acf0907a4f455f12f049a27" Dec 01 15:15:03 crc kubenswrapper[4931]: I1201 15:15:03.500987 4931 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk" Dec 01 15:15:19 crc kubenswrapper[4931]: I1201 15:15:19.812759 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-549c5b5d67-wrmzh" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.625272 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg"] Dec 01 15:15:20 crc kubenswrapper[4931]: E1201 15:15:20.626043 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269c4450-aefe-45b5-b753-0be7e90e8af6" containerName="collect-profiles" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.626067 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="269c4450-aefe-45b5-b753-0be7e90e8af6" containerName="collect-profiles" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.626216 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="269c4450-aefe-45b5-b753-0be7e90e8af6" containerName="collect-profiles" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.626852 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.629241 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.630233 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gcsz9" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.632716 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ch7gc"] Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.636968 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.642517 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg"] Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.645563 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.645629 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.745660 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qb2rl"] Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.746748 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.749295 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.751768 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.752809 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-smfvq" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.753025 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.762683 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-jcr8j"] Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.764104 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.765954 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.782123 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-jcr8j"] Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.802937 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d427cc71-929e-42ae-bc96-87360ba8c005-metallb-excludel2\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.802985 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clwv\" (UniqueName: \"kubernetes.io/projected/d143b724-552d-4530-844e-c8b752b2ffa3-kube-api-access-6clwv\") pod \"controller-f8648f98b-jcr8j\" (UID: \"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803015 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwwb4\" (UniqueName: \"kubernetes.io/projected/84cf14c6-6a50-4863-98a4-caab8b1f5636-kube-api-access-jwwb4\") pod \"frr-k8s-webhook-server-7fcb986d4-rj9kg\" (UID: \"84cf14c6-6a50-4863-98a4-caab8b1f5636\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803040 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d143b724-552d-4530-844e-c8b752b2ffa3-metrics-certs\") pod \"controller-f8648f98b-jcr8j\" (UID: 
\"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803070 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f1e866ab-084b-436d-86bc-97b7a45e8515-frr-startup\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803088 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-frr-conf\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803109 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-frr-sockets\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803132 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8kf6\" (UniqueName: \"kubernetes.io/projected/f1e866ab-084b-436d-86bc-97b7a45e8515-kube-api-access-l8kf6\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803152 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-reloader\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 
15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803168 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-metrics\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803190 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1e866ab-084b-436d-86bc-97b7a45e8515-metrics-certs\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803212 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84cf14c6-6a50-4863-98a4-caab8b1f5636-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-rj9kg\" (UID: \"84cf14c6-6a50-4863-98a4-caab8b1f5636\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803233 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-metrics-certs\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803259 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803278 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drs2w\" (UniqueName: \"kubernetes.io/projected/d427cc71-929e-42ae-bc96-87360ba8c005-kube-api-access-drs2w\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.803337 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d143b724-552d-4530-844e-c8b752b2ffa3-cert\") pod \"controller-f8648f98b-jcr8j\" (UID: \"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904583 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d143b724-552d-4530-844e-c8b752b2ffa3-cert\") pod \"controller-f8648f98b-jcr8j\" (UID: \"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904667 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d427cc71-929e-42ae-bc96-87360ba8c005-metallb-excludel2\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904693 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clwv\" (UniqueName: \"kubernetes.io/projected/d143b724-552d-4530-844e-c8b752b2ffa3-kube-api-access-6clwv\") pod \"controller-f8648f98b-jcr8j\" (UID: \"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904727 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwwb4\" 
(UniqueName: \"kubernetes.io/projected/84cf14c6-6a50-4863-98a4-caab8b1f5636-kube-api-access-jwwb4\") pod \"frr-k8s-webhook-server-7fcb986d4-rj9kg\" (UID: \"84cf14c6-6a50-4863-98a4-caab8b1f5636\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904749 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d143b724-552d-4530-844e-c8b752b2ffa3-metrics-certs\") pod \"controller-f8648f98b-jcr8j\" (UID: \"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904773 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f1e866ab-084b-436d-86bc-97b7a45e8515-frr-startup\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904807 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-frr-conf\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904831 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-frr-sockets\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904861 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8kf6\" (UniqueName: \"kubernetes.io/projected/f1e866ab-084b-436d-86bc-97b7a45e8515-kube-api-access-l8kf6\") pod 
\"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904898 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-reloader\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904917 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-metrics\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904942 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1e866ab-084b-436d-86bc-97b7a45e8515-metrics-certs\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904966 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84cf14c6-6a50-4863-98a4-caab8b1f5636-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-rj9kg\" (UID: \"84cf14c6-6a50-4863-98a4-caab8b1f5636\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.904989 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-metrics-certs\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.905010 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.905043 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drs2w\" (UniqueName: \"kubernetes.io/projected/d427cc71-929e-42ae-bc96-87360ba8c005-kube-api-access-drs2w\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: E1201 15:15:20.905192 4931 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 01 15:15:20 crc kubenswrapper[4931]: E1201 15:15:20.905288 4931 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 15:15:20 crc kubenswrapper[4931]: E1201 15:15:20.905324 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d143b724-552d-4530-844e-c8b752b2ffa3-metrics-certs podName:d143b724-552d-4530-844e-c8b752b2ffa3 nodeName:}" failed. No retries permitted until 2025-12-01 15:15:21.405290134 +0000 UTC m=+867.831163851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d143b724-552d-4530-844e-c8b752b2ffa3-metrics-certs") pod "controller-f8648f98b-jcr8j" (UID: "d143b724-552d-4530-844e-c8b752b2ffa3") : secret "controller-certs-secret" not found Dec 01 15:15:20 crc kubenswrapper[4931]: E1201 15:15:20.905356 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist podName:d427cc71-929e-42ae-bc96-87360ba8c005 nodeName:}" failed. 
No retries permitted until 2025-12-01 15:15:21.405343365 +0000 UTC m=+867.831217062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist") pod "speaker-qb2rl" (UID: "d427cc71-929e-42ae-bc96-87360ba8c005") : secret "metallb-memberlist" not found Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.905790 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-metrics\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.905884 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-reloader\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.905994 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-frr-sockets\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.906053 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d427cc71-929e-42ae-bc96-87360ba8c005-metallb-excludel2\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.906196 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/f1e866ab-084b-436d-86bc-97b7a45e8515-frr-startup\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.906457 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f1e866ab-084b-436d-86bc-97b7a45e8515-frr-conf\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.909296 4931 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.914301 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1e866ab-084b-436d-86bc-97b7a45e8515-metrics-certs\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.914496 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84cf14c6-6a50-4863-98a4-caab8b1f5636-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-rj9kg\" (UID: \"84cf14c6-6a50-4863-98a4-caab8b1f5636\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.920581 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-metrics-certs\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.923887 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d143b724-552d-4530-844e-c8b752b2ffa3-cert\") pod \"controller-f8648f98b-jcr8j\" (UID: \"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.931412 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8kf6\" (UniqueName: \"kubernetes.io/projected/f1e866ab-084b-436d-86bc-97b7a45e8515-kube-api-access-l8kf6\") pod \"frr-k8s-ch7gc\" (UID: \"f1e866ab-084b-436d-86bc-97b7a45e8515\") " pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.932161 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drs2w\" (UniqueName: \"kubernetes.io/projected/d427cc71-929e-42ae-bc96-87360ba8c005-kube-api-access-drs2w\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.935280 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwwb4\" (UniqueName: \"kubernetes.io/projected/84cf14c6-6a50-4863-98a4-caab8b1f5636-kube-api-access-jwwb4\") pod \"frr-k8s-webhook-server-7fcb986d4-rj9kg\" (UID: \"84cf14c6-6a50-4863-98a4-caab8b1f5636\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.936436 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clwv\" (UniqueName: \"kubernetes.io/projected/d143b724-552d-4530-844e-c8b752b2ffa3-kube-api-access-6clwv\") pod \"controller-f8648f98b-jcr8j\" (UID: \"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.955860 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:20 crc kubenswrapper[4931]: I1201 15:15:20.975105 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:21 crc kubenswrapper[4931]: I1201 15:15:21.247085 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg"] Dec 01 15:15:21 crc kubenswrapper[4931]: W1201 15:15:21.252717 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84cf14c6_6a50_4863_98a4_caab8b1f5636.slice/crio-bc24675563c6172f6f3fc4d9f1a56ead171489d701ef90a2486ae7767e4e3ffe WatchSource:0}: Error finding container bc24675563c6172f6f3fc4d9f1a56ead171489d701ef90a2486ae7767e4e3ffe: Status 404 returned error can't find the container with id bc24675563c6172f6f3fc4d9f1a56ead171489d701ef90a2486ae7767e4e3ffe Dec 01 15:15:21 crc kubenswrapper[4931]: I1201 15:15:21.411028 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:21 crc kubenswrapper[4931]: I1201 15:15:21.411693 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d143b724-552d-4530-844e-c8b752b2ffa3-metrics-certs\") pod \"controller-f8648f98b-jcr8j\" (UID: \"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:21 crc kubenswrapper[4931]: E1201 15:15:21.411341 4931 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 15:15:21 crc kubenswrapper[4931]: E1201 15:15:21.411912 4931 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist podName:d427cc71-929e-42ae-bc96-87360ba8c005 nodeName:}" failed. No retries permitted until 2025-12-01 15:15:22.411890972 +0000 UTC m=+868.837764649 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist") pod "speaker-qb2rl" (UID: "d427cc71-929e-42ae-bc96-87360ba8c005") : secret "metallb-memberlist" not found Dec 01 15:15:21 crc kubenswrapper[4931]: I1201 15:15:21.419575 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d143b724-552d-4530-844e-c8b752b2ffa3-metrics-certs\") pod \"controller-f8648f98b-jcr8j\" (UID: \"d143b724-552d-4530-844e-c8b752b2ffa3\") " pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:21 crc kubenswrapper[4931]: I1201 15:15:21.651191 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerStarted","Data":"7fe9d4f9e82da06962b84fe9d9d7f2a97fdfc967a75f96dd70c3c3342abf0b4c"} Dec 01 15:15:21 crc kubenswrapper[4931]: I1201 15:15:21.653667 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" event={"ID":"84cf14c6-6a50-4863-98a4-caab8b1f5636","Type":"ContainerStarted","Data":"bc24675563c6172f6f3fc4d9f1a56ead171489d701ef90a2486ae7767e4e3ffe"} Dec 01 15:15:21 crc kubenswrapper[4931]: I1201 15:15:21.682759 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:22 crc kubenswrapper[4931]: I1201 15:15:22.256512 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-jcr8j"] Dec 01 15:15:22 crc kubenswrapper[4931]: I1201 15:15:22.432202 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:22 crc kubenswrapper[4931]: E1201 15:15:22.432408 4931 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 15:15:22 crc kubenswrapper[4931]: E1201 15:15:22.432531 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist podName:d427cc71-929e-42ae-bc96-87360ba8c005 nodeName:}" failed. No retries permitted until 2025-12-01 15:15:24.432511113 +0000 UTC m=+870.858384780 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist") pod "speaker-qb2rl" (UID: "d427cc71-929e-42ae-bc96-87360ba8c005") : secret "metallb-memberlist" not found Dec 01 15:15:22 crc kubenswrapper[4931]: I1201 15:15:22.669089 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-jcr8j" event={"ID":"d143b724-552d-4530-844e-c8b752b2ffa3","Type":"ContainerStarted","Data":"26f8709dc207951b74036ae72e6e32ac2a48d7a0e5d9c3be617616c954e15b46"} Dec 01 15:15:22 crc kubenswrapper[4931]: I1201 15:15:22.669156 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-jcr8j" event={"ID":"d143b724-552d-4530-844e-c8b752b2ffa3","Type":"ContainerStarted","Data":"f8c8b012cc3d3d93232d582c7d37e420abd8ed1b12ebc80201ba58570e2421af"} Dec 01 15:15:22 crc kubenswrapper[4931]: I1201 15:15:22.669168 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-jcr8j" event={"ID":"d143b724-552d-4530-844e-c8b752b2ffa3","Type":"ContainerStarted","Data":"0d1e9dba30aeded8e1dba7d14849678b9b0ae51757481ef575fda31af66a6491"} Dec 01 15:15:22 crc kubenswrapper[4931]: I1201 15:15:22.670025 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 15:15:24.276773 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-jcr8j" podStartSLOduration=4.276750031 podStartE2EDuration="4.276750031s" podCreationTimestamp="2025-12-01 15:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:15:22.709614853 +0000 UTC m=+869.135488520" watchObservedRunningTime="2025-12-01 15:15:24.276750031 +0000 UTC m=+870.702623698" Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 
15:15:24.462802 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 15:15:24.470095 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d427cc71-929e-42ae-bc96-87360ba8c005-memberlist\") pod \"speaker-qb2rl\" (UID: \"d427cc71-929e-42ae-bc96-87360ba8c005\") " pod="metallb-system/speaker-qb2rl" Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 15:15:24.665024 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qb2rl" Dec 01 15:15:24 crc kubenswrapper[4931]: W1201 15:15:24.710375 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd427cc71_929e_42ae_bc96_87360ba8c005.slice/crio-56164a4df3aa2357eee90e5598872b74eba070c1342c6ef72c4cc8b9da78da9b WatchSource:0}: Error finding container 56164a4df3aa2357eee90e5598872b74eba070c1342c6ef72c4cc8b9da78da9b: Status 404 returned error can't find the container with id 56164a4df3aa2357eee90e5598872b74eba070c1342c6ef72c4cc8b9da78da9b Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 15:15:24.890701 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sznp4"] Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 15:15:24.891896 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 15:15:24.919142 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sznp4"] Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 15:15:24.970739 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-utilities\") pod \"community-operators-sznp4\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 15:15:24.970798 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-catalog-content\") pod \"community-operators-sznp4\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:24 crc kubenswrapper[4931]: I1201 15:15:24.970830 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jn2k\" (UniqueName: \"kubernetes.io/projected/45d20085-b2be-451d-a12b-24f075a9b97f-kube-api-access-8jn2k\") pod \"community-operators-sznp4\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.072096 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jn2k\" (UniqueName: \"kubernetes.io/projected/45d20085-b2be-451d-a12b-24f075a9b97f-kube-api-access-8jn2k\") pod \"community-operators-sznp4\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.072258 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-utilities\") pod \"community-operators-sznp4\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.072304 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-catalog-content\") pod \"community-operators-sznp4\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.073089 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-utilities\") pod \"community-operators-sznp4\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.073174 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-catalog-content\") pod \"community-operators-sznp4\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.092931 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jn2k\" (UniqueName: \"kubernetes.io/projected/45d20085-b2be-451d-a12b-24f075a9b97f-kube-api-access-8jn2k\") pod \"community-operators-sznp4\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.216564 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.694680 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qb2rl" event={"ID":"d427cc71-929e-42ae-bc96-87360ba8c005","Type":"ContainerStarted","Data":"502ce464ca304601c75d4cd9cc918c8217a26c5f9ebd602625b5e11f9dffa8c5"} Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.695350 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qb2rl" event={"ID":"d427cc71-929e-42ae-bc96-87360ba8c005","Type":"ContainerStarted","Data":"2cd9ebc31cd12b8735edd458cec7d5e53a0a489455beb61e20113f6e0cf75adc"} Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.695365 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qb2rl" event={"ID":"d427cc71-929e-42ae-bc96-87360ba8c005","Type":"ContainerStarted","Data":"56164a4df3aa2357eee90e5598872b74eba070c1342c6ef72c4cc8b9da78da9b"} Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.695633 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qb2rl" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.798371 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qb2rl" podStartSLOduration=5.798350278 podStartE2EDuration="5.798350278s" podCreationTimestamp="2025-12-01 15:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:15:25.726129688 +0000 UTC m=+872.152003355" watchObservedRunningTime="2025-12-01 15:15:25.798350278 +0000 UTC m=+872.224223945" Dec 01 15:15:25 crc kubenswrapper[4931]: I1201 15:15:25.802052 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sznp4"] Dec 01 15:15:25 crc kubenswrapper[4931]: W1201 15:15:25.817448 4931 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45d20085_b2be_451d_a12b_24f075a9b97f.slice/crio-171b26253509ec469522ccaab2c3ecb60b1eb7143a1604226956f595b3003a76 WatchSource:0}: Error finding container 171b26253509ec469522ccaab2c3ecb60b1eb7143a1604226956f595b3003a76: Status 404 returned error can't find the container with id 171b26253509ec469522ccaab2c3ecb60b1eb7143a1604226956f595b3003a76 Dec 01 15:15:26 crc kubenswrapper[4931]: I1201 15:15:26.760877 4931 generic.go:334] "Generic (PLEG): container finished" podID="45d20085-b2be-451d-a12b-24f075a9b97f" containerID="ebe58053dfbda9171f1d0dcb1960e2513aec20bcfe1d7815a5ce807fcbaad219" exitCode=0 Dec 01 15:15:26 crc kubenswrapper[4931]: I1201 15:15:26.760939 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sznp4" event={"ID":"45d20085-b2be-451d-a12b-24f075a9b97f","Type":"ContainerDied","Data":"ebe58053dfbda9171f1d0dcb1960e2513aec20bcfe1d7815a5ce807fcbaad219"} Dec 01 15:15:26 crc kubenswrapper[4931]: I1201 15:15:26.761584 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sznp4" event={"ID":"45d20085-b2be-451d-a12b-24f075a9b97f","Type":"ContainerStarted","Data":"171b26253509ec469522ccaab2c3ecb60b1eb7143a1604226956f595b3003a76"} Dec 01 15:15:30 crc kubenswrapper[4931]: I1201 15:15:30.803048 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" event={"ID":"84cf14c6-6a50-4863-98a4-caab8b1f5636","Type":"ContainerStarted","Data":"b75fb37d91226c44f9d0a2b0077db0bdf0ef5ef9e00657aa84aabe36f7270442"} Dec 01 15:15:30 crc kubenswrapper[4931]: I1201 15:15:30.805612 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:30 crc kubenswrapper[4931]: I1201 15:15:30.812118 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="f1e866ab-084b-436d-86bc-97b7a45e8515" containerID="48915129fd5834478b72e818e13f2e4862abcc25bb981f11bcc1f672413a5b3f" exitCode=0 Dec 01 15:15:30 crc kubenswrapper[4931]: I1201 15:15:30.812315 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerDied","Data":"48915129fd5834478b72e818e13f2e4862abcc25bb981f11bcc1f672413a5b3f"} Dec 01 15:15:30 crc kubenswrapper[4931]: I1201 15:15:30.816951 4931 generic.go:334] "Generic (PLEG): container finished" podID="45d20085-b2be-451d-a12b-24f075a9b97f" containerID="a1edc07949761aa4374cd27523cad5d06d567c208d9b86be6dd459c1bb90d256" exitCode=0 Dec 01 15:15:30 crc kubenswrapper[4931]: I1201 15:15:30.817195 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sznp4" event={"ID":"45d20085-b2be-451d-a12b-24f075a9b97f","Type":"ContainerDied","Data":"a1edc07949761aa4374cd27523cad5d06d567c208d9b86be6dd459c1bb90d256"} Dec 01 15:15:30 crc kubenswrapper[4931]: I1201 15:15:30.837672 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" podStartSLOduration=2.4849412810000002 podStartE2EDuration="10.837625702s" podCreationTimestamp="2025-12-01 15:15:20 +0000 UTC" firstStartedPulling="2025-12-01 15:15:21.260839412 +0000 UTC m=+867.686713079" lastFinishedPulling="2025-12-01 15:15:29.613523833 +0000 UTC m=+876.039397500" observedRunningTime="2025-12-01 15:15:30.826891933 +0000 UTC m=+877.252765610" watchObservedRunningTime="2025-12-01 15:15:30.837625702 +0000 UTC m=+877.263499379" Dec 01 15:15:31 crc kubenswrapper[4931]: I1201 15:15:31.830997 4931 generic.go:334] "Generic (PLEG): container finished" podID="f1e866ab-084b-436d-86bc-97b7a45e8515" containerID="5f4e93fc2931202b017063dc3080b54458395a8c4f883e17afb4f338f4612d12" exitCode=0 Dec 01 15:15:31 crc kubenswrapper[4931]: I1201 15:15:31.831135 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerDied","Data":"5f4e93fc2931202b017063dc3080b54458395a8c4f883e17afb4f338f4612d12"} Dec 01 15:15:31 crc kubenswrapper[4931]: I1201 15:15:31.835718 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sznp4" event={"ID":"45d20085-b2be-451d-a12b-24f075a9b97f","Type":"ContainerStarted","Data":"15b1a3b2cfa7fa0e762ad4f4db469f38dfd688be8fb1384b1400861c682e2916"} Dec 01 15:15:31 crc kubenswrapper[4931]: I1201 15:15:31.900260 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sznp4" podStartSLOduration=3.318277428 podStartE2EDuration="7.900225252s" podCreationTimestamp="2025-12-01 15:15:24 +0000 UTC" firstStartedPulling="2025-12-01 15:15:26.764781639 +0000 UTC m=+873.190655306" lastFinishedPulling="2025-12-01 15:15:31.346729463 +0000 UTC m=+877.772603130" observedRunningTime="2025-12-01 15:15:31.897925446 +0000 UTC m=+878.323799113" watchObservedRunningTime="2025-12-01 15:15:31.900225252 +0000 UTC m=+878.326098949" Dec 01 15:15:32 crc kubenswrapper[4931]: I1201 15:15:32.843964 4931 generic.go:334] "Generic (PLEG): container finished" podID="f1e866ab-084b-436d-86bc-97b7a45e8515" containerID="9e42cbbe2164e414b8ac51b2b43af02d04a4033392eb6e025e7b380a6e8705e9" exitCode=0 Dec 01 15:15:32 crc kubenswrapper[4931]: I1201 15:15:32.844062 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerDied","Data":"9e42cbbe2164e414b8ac51b2b43af02d04a4033392eb6e025e7b380a6e8705e9"} Dec 01 15:15:33 crc kubenswrapper[4931]: I1201 15:15:33.858315 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" 
event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerStarted","Data":"ad24921e929c92ef6a376e0d79fb3f8b0b6bfc817034e4bddaefedea3145a03f"} Dec 01 15:15:33 crc kubenswrapper[4931]: I1201 15:15:33.858843 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerStarted","Data":"c355475cf8614e32e839470908f3a5648bfc8cd7de8779503ae296e8e095ef97"} Dec 01 15:15:33 crc kubenswrapper[4931]: I1201 15:15:33.858858 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerStarted","Data":"e9e32347d48281eb0937264e6fb7dbcb0a7601b2ac2c12518a55fab64c829cb2"} Dec 01 15:15:33 crc kubenswrapper[4931]: I1201 15:15:33.858871 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerStarted","Data":"5c8d1c9e0fe9af6b871a88b56ec444ccb7e37cca1c6858c672b3d07d2c4a591f"} Dec 01 15:15:34 crc kubenswrapper[4931]: I1201 15:15:34.872091 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerStarted","Data":"e2f9edaedc1e230e8b163f111fc024f33777183888d3371e1e2322586f0967da"} Dec 01 15:15:34 crc kubenswrapper[4931]: I1201 15:15:34.872161 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ch7gc" event={"ID":"f1e866ab-084b-436d-86bc-97b7a45e8515","Type":"ContainerStarted","Data":"11efd70e19767f140362f8c31cbc03ea67df96f0a2b30684db114735c62c4f2a"} Dec 01 15:15:34 crc kubenswrapper[4931]: I1201 15:15:34.872600 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:34 crc kubenswrapper[4931]: I1201 15:15:34.905773 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-ch7gc" podStartSLOduration=6.52700036 podStartE2EDuration="14.905753891s" podCreationTimestamp="2025-12-01 15:15:20 +0000 UTC" firstStartedPulling="2025-12-01 15:15:21.210457512 +0000 UTC m=+867.636331179" lastFinishedPulling="2025-12-01 15:15:29.589211043 +0000 UTC m=+876.015084710" observedRunningTime="2025-12-01 15:15:34.903075654 +0000 UTC m=+881.328949341" watchObservedRunningTime="2025-12-01 15:15:34.905753891 +0000 UTC m=+881.331627558" Dec 01 15:15:35 crc kubenswrapper[4931]: I1201 15:15:35.217226 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:35 crc kubenswrapper[4931]: I1201 15:15:35.217328 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:35 crc kubenswrapper[4931]: I1201 15:15:35.284523 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:35 crc kubenswrapper[4931]: I1201 15:15:35.975579 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:36 crc kubenswrapper[4931]: I1201 15:15:36.029008 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:40 crc kubenswrapper[4931]: I1201 15:15:40.966656 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-rj9kg" Dec 01 15:15:41 crc kubenswrapper[4931]: I1201 15:15:41.693416 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-jcr8j" Dec 01 15:15:44 crc kubenswrapper[4931]: I1201 15:15:44.670591 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qb2rl" Dec 01 15:15:45 crc kubenswrapper[4931]: I1201 
15:15:45.301973 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:45 crc kubenswrapper[4931]: I1201 15:15:45.370563 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sznp4"] Dec 01 15:15:45 crc kubenswrapper[4931]: I1201 15:15:45.974606 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sznp4" podUID="45d20085-b2be-451d-a12b-24f075a9b97f" containerName="registry-server" containerID="cri-o://15b1a3b2cfa7fa0e762ad4f4db469f38dfd688be8fb1384b1400861c682e2916" gracePeriod=2 Dec 01 15:15:46 crc kubenswrapper[4931]: I1201 15:15:46.952751 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r2t9j"] Dec 01 15:15:46 crc kubenswrapper[4931]: I1201 15:15:46.954096 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:46 crc kubenswrapper[4931]: I1201 15:15:46.975016 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2t9j"] Dec 01 15:15:46 crc kubenswrapper[4931]: I1201 15:15:46.985541 4931 generic.go:334] "Generic (PLEG): container finished" podID="45d20085-b2be-451d-a12b-24f075a9b97f" containerID="15b1a3b2cfa7fa0e762ad4f4db469f38dfd688be8fb1384b1400861c682e2916" exitCode=0 Dec 01 15:15:46 crc kubenswrapper[4931]: I1201 15:15:46.985602 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sznp4" event={"ID":"45d20085-b2be-451d-a12b-24f075a9b97f","Type":"ContainerDied","Data":"15b1a3b2cfa7fa0e762ad4f4db469f38dfd688be8fb1384b1400861c682e2916"} Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.030314 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-utilities\") pod \"redhat-marketplace-r2t9j\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.030529 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngvl8\" (UniqueName: \"kubernetes.io/projected/bc86188c-a8ef-4369-a95f-e9f4255db634-kube-api-access-ngvl8\") pod \"redhat-marketplace-r2t9j\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.030590 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-catalog-content\") pod \"redhat-marketplace-r2t9j\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.132241 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-utilities\") pod \"redhat-marketplace-r2t9j\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.132355 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngvl8\" (UniqueName: \"kubernetes.io/projected/bc86188c-a8ef-4369-a95f-e9f4255db634-kube-api-access-ngvl8\") pod \"redhat-marketplace-r2t9j\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.132403 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-catalog-content\") pod \"redhat-marketplace-r2t9j\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.132955 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-utilities\") pod \"redhat-marketplace-r2t9j\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.132974 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-catalog-content\") pod \"redhat-marketplace-r2t9j\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.162230 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngvl8\" (UniqueName: \"kubernetes.io/projected/bc86188c-a8ef-4369-a95f-e9f4255db634-kube-api-access-ngvl8\") pod \"redhat-marketplace-r2t9j\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.301321 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.539217 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.641307 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-utilities\") pod \"45d20085-b2be-451d-a12b-24f075a9b97f\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.641418 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-catalog-content\") pod \"45d20085-b2be-451d-a12b-24f075a9b97f\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.641579 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jn2k\" (UniqueName: \"kubernetes.io/projected/45d20085-b2be-451d-a12b-24f075a9b97f-kube-api-access-8jn2k\") pod \"45d20085-b2be-451d-a12b-24f075a9b97f\" (UID: \"45d20085-b2be-451d-a12b-24f075a9b97f\") " Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.642456 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-utilities" (OuterVolumeSpecName: "utilities") pod "45d20085-b2be-451d-a12b-24f075a9b97f" (UID: "45d20085-b2be-451d-a12b-24f075a9b97f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.648947 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d20085-b2be-451d-a12b-24f075a9b97f-kube-api-access-8jn2k" (OuterVolumeSpecName: "kube-api-access-8jn2k") pod "45d20085-b2be-451d-a12b-24f075a9b97f" (UID: "45d20085-b2be-451d-a12b-24f075a9b97f"). InnerVolumeSpecName "kube-api-access-8jn2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.697719 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45d20085-b2be-451d-a12b-24f075a9b97f" (UID: "45d20085-b2be-451d-a12b-24f075a9b97f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.743096 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jn2k\" (UniqueName: \"kubernetes.io/projected/45d20085-b2be-451d-a12b-24f075a9b97f-kube-api-access-8jn2k\") on node \"crc\" DevicePath \"\"" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.743136 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.743149 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d20085-b2be-451d-a12b-24f075a9b97f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.756855 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2t9j"] Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.992324 4931 generic.go:334] "Generic (PLEG): container finished" podID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerID="a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c" exitCode=0 Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.992445 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2t9j" 
event={"ID":"bc86188c-a8ef-4369-a95f-e9f4255db634","Type":"ContainerDied","Data":"a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c"} Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.992505 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2t9j" event={"ID":"bc86188c-a8ef-4369-a95f-e9f4255db634","Type":"ContainerStarted","Data":"85f7dd94d7f526042bf8eab62efb6deda68d759958ae9a0db1afb850ae23ebc9"} Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.995640 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sznp4" event={"ID":"45d20085-b2be-451d-a12b-24f075a9b97f","Type":"ContainerDied","Data":"171b26253509ec469522ccaab2c3ecb60b1eb7143a1604226956f595b3003a76"} Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.995693 4931 scope.go:117] "RemoveContainer" containerID="15b1a3b2cfa7fa0e762ad4f4db469f38dfd688be8fb1384b1400861c682e2916" Dec 01 15:15:47 crc kubenswrapper[4931]: I1201 15:15:47.995710 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sznp4" Dec 01 15:15:48 crc kubenswrapper[4931]: I1201 15:15:48.014135 4931 scope.go:117] "RemoveContainer" containerID="a1edc07949761aa4374cd27523cad5d06d567c208d9b86be6dd459c1bb90d256" Dec 01 15:15:48 crc kubenswrapper[4931]: I1201 15:15:48.028836 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sznp4"] Dec 01 15:15:48 crc kubenswrapper[4931]: I1201 15:15:48.033120 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sznp4"] Dec 01 15:15:48 crc kubenswrapper[4931]: I1201 15:15:48.042355 4931 scope.go:117] "RemoveContainer" containerID="ebe58053dfbda9171f1d0dcb1960e2513aec20bcfe1d7815a5ce807fcbaad219" Dec 01 15:15:48 crc kubenswrapper[4931]: I1201 15:15:48.253513 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d20085-b2be-451d-a12b-24f075a9b97f" path="/var/lib/kubelet/pods/45d20085-b2be-451d-a12b-24f075a9b97f/volumes" Dec 01 15:15:49 crc kubenswrapper[4931]: I1201 15:15:49.006548 4931 generic.go:334] "Generic (PLEG): container finished" podID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerID="92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b" exitCode=0 Dec 01 15:15:49 crc kubenswrapper[4931]: I1201 15:15:49.006796 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2t9j" event={"ID":"bc86188c-a8ef-4369-a95f-e9f4255db634","Type":"ContainerDied","Data":"92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b"} Dec 01 15:15:50 crc kubenswrapper[4931]: I1201 15:15:50.017133 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2t9j" event={"ID":"bc86188c-a8ef-4369-a95f-e9f4255db634","Type":"ContainerStarted","Data":"2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe"} Dec 01 15:15:50 crc kubenswrapper[4931]: I1201 15:15:50.042635 4931 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r2t9j" podStartSLOduration=2.237960085 podStartE2EDuration="4.042612913s" podCreationTimestamp="2025-12-01 15:15:46 +0000 UTC" firstStartedPulling="2025-12-01 15:15:47.995258136 +0000 UTC m=+894.421131803" lastFinishedPulling="2025-12-01 15:15:49.799910954 +0000 UTC m=+896.225784631" observedRunningTime="2025-12-01 15:15:50.03970376 +0000 UTC m=+896.465577437" watchObservedRunningTime="2025-12-01 15:15:50.042612913 +0000 UTC m=+896.468486600" Dec 01 15:15:50 crc kubenswrapper[4931]: I1201 15:15:50.978905 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ch7gc" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.358412 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-72slc"] Dec 01 15:15:52 crc kubenswrapper[4931]: E1201 15:15:52.358815 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d20085-b2be-451d-a12b-24f075a9b97f" containerName="extract-content" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.358832 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d20085-b2be-451d-a12b-24f075a9b97f" containerName="extract-content" Dec 01 15:15:52 crc kubenswrapper[4931]: E1201 15:15:52.358849 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d20085-b2be-451d-a12b-24f075a9b97f" containerName="registry-server" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.358855 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d20085-b2be-451d-a12b-24f075a9b97f" containerName="registry-server" Dec 01 15:15:52 crc kubenswrapper[4931]: E1201 15:15:52.358870 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d20085-b2be-451d-a12b-24f075a9b97f" containerName="extract-utilities" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.358876 4931 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="45d20085-b2be-451d-a12b-24f075a9b97f" containerName="extract-utilities" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.359063 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d20085-b2be-451d-a12b-24f075a9b97f" containerName="registry-server" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.359757 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-72slc" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.363018 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.363299 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-vxgk5" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.363475 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.366017 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72slc"] Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.414933 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5q5b\" (UniqueName: \"kubernetes.io/projected/6fff8f62-0a00-45e5-9e66-bd92dff14023-kube-api-access-k5q5b\") pod \"openstack-operator-index-72slc\" (UID: \"6fff8f62-0a00-45e5-9e66-bd92dff14023\") " pod="openstack-operators/openstack-operator-index-72slc" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.516252 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5q5b\" (UniqueName: \"kubernetes.io/projected/6fff8f62-0a00-45e5-9e66-bd92dff14023-kube-api-access-k5q5b\") pod \"openstack-operator-index-72slc\" (UID: 
\"6fff8f62-0a00-45e5-9e66-bd92dff14023\") " pod="openstack-operators/openstack-operator-index-72slc" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.543911 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5q5b\" (UniqueName: \"kubernetes.io/projected/6fff8f62-0a00-45e5-9e66-bd92dff14023-kube-api-access-k5q5b\") pod \"openstack-operator-index-72slc\" (UID: \"6fff8f62-0a00-45e5-9e66-bd92dff14023\") " pod="openstack-operators/openstack-operator-index-72slc" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.708883 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-72slc" Dec 01 15:15:52 crc kubenswrapper[4931]: I1201 15:15:52.970788 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72slc"] Dec 01 15:15:52 crc kubenswrapper[4931]: W1201 15:15:52.977626 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fff8f62_0a00_45e5_9e66_bd92dff14023.slice/crio-6e99c5a4ebacea8c728b6d800ad4a77bb24fddc06e3b59fb43c600f674770e19 WatchSource:0}: Error finding container 6e99c5a4ebacea8c728b6d800ad4a77bb24fddc06e3b59fb43c600f674770e19: Status 404 returned error can't find the container with id 6e99c5a4ebacea8c728b6d800ad4a77bb24fddc06e3b59fb43c600f674770e19 Dec 01 15:15:53 crc kubenswrapper[4931]: I1201 15:15:53.041822 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72slc" event={"ID":"6fff8f62-0a00-45e5-9e66-bd92dff14023","Type":"ContainerStarted","Data":"6e99c5a4ebacea8c728b6d800ad4a77bb24fddc06e3b59fb43c600f674770e19"} Dec 01 15:15:57 crc kubenswrapper[4931]: I1201 15:15:57.075102 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72slc" 
event={"ID":"6fff8f62-0a00-45e5-9e66-bd92dff14023","Type":"ContainerStarted","Data":"e1cf2603547b11c03af65d35406acb7ef5db3457989729c2473303ebf85a9df6"} Dec 01 15:15:57 crc kubenswrapper[4931]: I1201 15:15:57.104500 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-72slc" podStartSLOduration=2.15755296 podStartE2EDuration="5.104465901s" podCreationTimestamp="2025-12-01 15:15:52 +0000 UTC" firstStartedPulling="2025-12-01 15:15:52.981510984 +0000 UTC m=+899.407384681" lastFinishedPulling="2025-12-01 15:15:55.928423955 +0000 UTC m=+902.354297622" observedRunningTime="2025-12-01 15:15:57.102503205 +0000 UTC m=+903.528376872" watchObservedRunningTime="2025-12-01 15:15:57.104465901 +0000 UTC m=+903.530339608" Dec 01 15:15:57 crc kubenswrapper[4931]: I1201 15:15:57.302712 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:57 crc kubenswrapper[4931]: I1201 15:15:57.302797 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:57 crc kubenswrapper[4931]: I1201 15:15:57.351728 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:15:58 crc kubenswrapper[4931]: I1201 15:15:58.128611 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.148401 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2t9j"] Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.148750 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r2t9j" podUID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerName="registry-server" 
containerID="cri-o://2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe" gracePeriod=2 Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.611871 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.785743 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvl8\" (UniqueName: \"kubernetes.io/projected/bc86188c-a8ef-4369-a95f-e9f4255db634-kube-api-access-ngvl8\") pod \"bc86188c-a8ef-4369-a95f-e9f4255db634\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.785815 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-utilities\") pod \"bc86188c-a8ef-4369-a95f-e9f4255db634\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.785896 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-catalog-content\") pod \"bc86188c-a8ef-4369-a95f-e9f4255db634\" (UID: \"bc86188c-a8ef-4369-a95f-e9f4255db634\") " Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.787567 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-utilities" (OuterVolumeSpecName: "utilities") pod "bc86188c-a8ef-4369-a95f-e9f4255db634" (UID: "bc86188c-a8ef-4369-a95f-e9f4255db634"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.793726 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc86188c-a8ef-4369-a95f-e9f4255db634-kube-api-access-ngvl8" (OuterVolumeSpecName: "kube-api-access-ngvl8") pod "bc86188c-a8ef-4369-a95f-e9f4255db634" (UID: "bc86188c-a8ef-4369-a95f-e9f4255db634"). InnerVolumeSpecName "kube-api-access-ngvl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.805586 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc86188c-a8ef-4369-a95f-e9f4255db634" (UID: "bc86188c-a8ef-4369-a95f-e9f4255db634"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.888217 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.888253 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc86188c-a8ef-4369-a95f-e9f4255db634-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:16:01 crc kubenswrapper[4931]: I1201 15:16:01.888264 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvl8\" (UniqueName: \"kubernetes.io/projected/bc86188c-a8ef-4369-a95f-e9f4255db634-kube-api-access-ngvl8\") on node \"crc\" DevicePath \"\"" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.117149 4931 generic.go:334] "Generic (PLEG): container finished" podID="bc86188c-a8ef-4369-a95f-e9f4255db634" 
containerID="2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe" exitCode=0 Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.117316 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2t9j" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.117272 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2t9j" event={"ID":"bc86188c-a8ef-4369-a95f-e9f4255db634","Type":"ContainerDied","Data":"2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe"} Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.117451 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2t9j" event={"ID":"bc86188c-a8ef-4369-a95f-e9f4255db634","Type":"ContainerDied","Data":"85f7dd94d7f526042bf8eab62efb6deda68d759958ae9a0db1afb850ae23ebc9"} Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.117479 4931 scope.go:117] "RemoveContainer" containerID="2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.143437 4931 scope.go:117] "RemoveContainer" containerID="92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.165180 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2t9j"] Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.170870 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2t9j"] Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.181847 4931 scope.go:117] "RemoveContainer" containerID="a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.201263 4931 scope.go:117] "RemoveContainer" containerID="2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe" Dec 01 
15:16:02 crc kubenswrapper[4931]: E1201 15:16:02.201718 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe\": container with ID starting with 2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe not found: ID does not exist" containerID="2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.201776 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe"} err="failed to get container status \"2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe\": rpc error: code = NotFound desc = could not find container \"2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe\": container with ID starting with 2cd6153298f0b157632a234d5f37c51a498fbc9e97d32b02f16adb8e2eac67fe not found: ID does not exist" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.201810 4931 scope.go:117] "RemoveContainer" containerID="92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b" Dec 01 15:16:02 crc kubenswrapper[4931]: E1201 15:16:02.202165 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b\": container with ID starting with 92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b not found: ID does not exist" containerID="92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.202205 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b"} err="failed to get container status 
\"92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b\": rpc error: code = NotFound desc = could not find container \"92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b\": container with ID starting with 92e909cdebb0cafc2e90adeabc2f23d771aaf1aabc23a74c0a09919d8852e87b not found: ID does not exist" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.202236 4931 scope.go:117] "RemoveContainer" containerID="a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c" Dec 01 15:16:02 crc kubenswrapper[4931]: E1201 15:16:02.202472 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c\": container with ID starting with a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c not found: ID does not exist" containerID="a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.202504 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c"} err="failed to get container status \"a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c\": rpc error: code = NotFound desc = could not find container \"a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c\": container with ID starting with a4cbfadeba4504e52a8837804d2d1d762b2141cb8c35a0f4e3a687c8734b178c not found: ID does not exist" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.253825 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc86188c-a8ef-4369-a95f-e9f4255db634" path="/var/lib/kubelet/pods/bc86188c-a8ef-4369-a95f-e9f4255db634/volumes" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.710059 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-72slc" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.710149 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-72slc" Dec 01 15:16:02 crc kubenswrapper[4931]: I1201 15:16:02.748062 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-72slc" Dec 01 15:16:03 crc kubenswrapper[4931]: I1201 15:16:03.176149 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-72slc" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.615569 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2"] Dec 01 15:16:06 crc kubenswrapper[4931]: E1201 15:16:06.616552 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerName="extract-utilities" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.616579 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerName="extract-utilities" Dec 01 15:16:06 crc kubenswrapper[4931]: E1201 15:16:06.616626 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerName="registry-server" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.616640 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerName="registry-server" Dec 01 15:16:06 crc kubenswrapper[4931]: E1201 15:16:06.616659 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerName="extract-content" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.616671 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerName="extract-content" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.616896 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc86188c-a8ef-4369-a95f-e9f4255db634" containerName="registry-server" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.618580 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.624111 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-n66gg" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.632826 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2"] Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.772793 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-bundle\") pod \"9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.772887 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-util\") pod \"9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.772930 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-64lgz\" (UniqueName: \"kubernetes.io/projected/f08eaf36-78a8-4183-b663-22eaefe5cb6b-kube-api-access-64lgz\") pod \"9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.874521 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-util\") pod \"9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.874582 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64lgz\" (UniqueName: \"kubernetes.io/projected/f08eaf36-78a8-4183-b663-22eaefe5cb6b-kube-api-access-64lgz\") pod \"9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.874655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-bundle\") pod \"9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.875589 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-bundle\") pod 
\"9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.875984 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-util\") pod \"9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.899271 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64lgz\" (UniqueName: \"kubernetes.io/projected/f08eaf36-78a8-4183-b663-22eaefe5cb6b-kube-api-access-64lgz\") pod \"9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:06 crc kubenswrapper[4931]: I1201 15:16:06.944734 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:07 crc kubenswrapper[4931]: I1201 15:16:07.267153 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2"] Dec 01 15:16:08 crc kubenswrapper[4931]: I1201 15:16:08.188579 4931 generic.go:334] "Generic (PLEG): container finished" podID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerID="9b2b1324e64b5ee84e543b41662ba1e8d605ea1fc3dae3a114d7f474e3b5a47f" exitCode=0 Dec 01 15:16:08 crc kubenswrapper[4931]: I1201 15:16:08.188663 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" event={"ID":"f08eaf36-78a8-4183-b663-22eaefe5cb6b","Type":"ContainerDied","Data":"9b2b1324e64b5ee84e543b41662ba1e8d605ea1fc3dae3a114d7f474e3b5a47f"} Dec 01 15:16:08 crc kubenswrapper[4931]: I1201 15:16:08.191459 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" event={"ID":"f08eaf36-78a8-4183-b663-22eaefe5cb6b","Type":"ContainerStarted","Data":"89b2e60278cc41d32668857578a52105f55991957036a5f17910d00b3b359b6b"} Dec 01 15:16:09 crc kubenswrapper[4931]: I1201 15:16:09.200199 4931 generic.go:334] "Generic (PLEG): container finished" podID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerID="3f0c2023259c1ae3c51706c765e812565ff37cc79d9352f0b976a6fd0ac6a656" exitCode=0 Dec 01 15:16:09 crc kubenswrapper[4931]: I1201 15:16:09.200283 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" event={"ID":"f08eaf36-78a8-4183-b663-22eaefe5cb6b","Type":"ContainerDied","Data":"3f0c2023259c1ae3c51706c765e812565ff37cc79d9352f0b976a6fd0ac6a656"} Dec 01 15:16:10 crc kubenswrapper[4931]: I1201 15:16:10.214271 4931 generic.go:334] 
"Generic (PLEG): container finished" podID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerID="1663c27de7a31368a33ffb3c770117f7ab3286b65bb9e079982109d33054f554" exitCode=0 Dec 01 15:16:10 crc kubenswrapper[4931]: I1201 15:16:10.214365 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" event={"ID":"f08eaf36-78a8-4183-b663-22eaefe5cb6b","Type":"ContainerDied","Data":"1663c27de7a31368a33ffb3c770117f7ab3286b65bb9e079982109d33054f554"} Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.562952 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.647147 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64lgz\" (UniqueName: \"kubernetes.io/projected/f08eaf36-78a8-4183-b663-22eaefe5cb6b-kube-api-access-64lgz\") pod \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.647249 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-util\") pod \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.647291 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-bundle\") pod \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\" (UID: \"f08eaf36-78a8-4183-b663-22eaefe5cb6b\") " Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.648342 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-bundle" (OuterVolumeSpecName: "bundle") pod "f08eaf36-78a8-4183-b663-22eaefe5cb6b" (UID: "f08eaf36-78a8-4183-b663-22eaefe5cb6b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.655709 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f08eaf36-78a8-4183-b663-22eaefe5cb6b-kube-api-access-64lgz" (OuterVolumeSpecName: "kube-api-access-64lgz") pod "f08eaf36-78a8-4183-b663-22eaefe5cb6b" (UID: "f08eaf36-78a8-4183-b663-22eaefe5cb6b"). InnerVolumeSpecName "kube-api-access-64lgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.662049 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-util" (OuterVolumeSpecName: "util") pod "f08eaf36-78a8-4183-b663-22eaefe5cb6b" (UID: "f08eaf36-78a8-4183-b663-22eaefe5cb6b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.748800 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64lgz\" (UniqueName: \"kubernetes.io/projected/f08eaf36-78a8-4183-b663-22eaefe5cb6b-kube-api-access-64lgz\") on node \"crc\" DevicePath \"\"" Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.748854 4931 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-util\") on node \"crc\" DevicePath \"\"" Dec 01 15:16:11 crc kubenswrapper[4931]: I1201 15:16:11.748873 4931 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f08eaf36-78a8-4183-b663-22eaefe5cb6b-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:16:12 crc kubenswrapper[4931]: I1201 15:16:12.234972 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" event={"ID":"f08eaf36-78a8-4183-b663-22eaefe5cb6b","Type":"ContainerDied","Data":"89b2e60278cc41d32668857578a52105f55991957036a5f17910d00b3b359b6b"} Dec 01 15:16:12 crc kubenswrapper[4931]: I1201 15:16:12.235563 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b2e60278cc41d32668857578a52105f55991957036a5f17910d00b3b359b6b" Dec 01 15:16:12 crc kubenswrapper[4931]: I1201 15:16:12.235085 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.561344 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr"] Dec 01 15:16:14 crc kubenswrapper[4931]: E1201 15:16:14.561607 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerName="extract" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.561621 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerName="extract" Dec 01 15:16:14 crc kubenswrapper[4931]: E1201 15:16:14.561635 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerName="pull" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.561641 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerName="pull" Dec 01 15:16:14 crc kubenswrapper[4931]: E1201 15:16:14.561662 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerName="util" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.561668 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerName="util" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.561770 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08eaf36-78a8-4183-b663-22eaefe5cb6b" containerName="extract" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.562188 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.566336 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-wshnp" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.608021 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr"] Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.697262 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwp4t\" (UniqueName: \"kubernetes.io/projected/a34fc488-c895-4f50-9164-ced702fcf61d-kube-api-access-lwp4t\") pod \"openstack-operator-controller-operator-654ffbd64b-qsbsr\" (UID: \"a34fc488-c895-4f50-9164-ced702fcf61d\") " pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.799369 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwp4t\" (UniqueName: \"kubernetes.io/projected/a34fc488-c895-4f50-9164-ced702fcf61d-kube-api-access-lwp4t\") pod \"openstack-operator-controller-operator-654ffbd64b-qsbsr\" (UID: \"a34fc488-c895-4f50-9164-ced702fcf61d\") " pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.844218 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwp4t\" (UniqueName: \"kubernetes.io/projected/a34fc488-c895-4f50-9164-ced702fcf61d-kube-api-access-lwp4t\") pod \"openstack-operator-controller-operator-654ffbd64b-qsbsr\" (UID: \"a34fc488-c895-4f50-9164-ced702fcf61d\") " pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" Dec 01 15:16:14 crc kubenswrapper[4931]: I1201 15:16:14.882602 4931 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" Dec 01 15:16:15 crc kubenswrapper[4931]: I1201 15:16:15.319167 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr"] Dec 01 15:16:15 crc kubenswrapper[4931]: W1201 15:16:15.323994 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda34fc488_c895_4f50_9164_ced702fcf61d.slice/crio-5397326bdc26ef52d60f42d3d2966ef8a4ba4b8fe6f2730f5119bd6c96c229f8 WatchSource:0}: Error finding container 5397326bdc26ef52d60f42d3d2966ef8a4ba4b8fe6f2730f5119bd6c96c229f8: Status 404 returned error can't find the container with id 5397326bdc26ef52d60f42d3d2966ef8a4ba4b8fe6f2730f5119bd6c96c229f8 Dec 01 15:16:16 crc kubenswrapper[4931]: I1201 15:16:16.266823 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" event={"ID":"a34fc488-c895-4f50-9164-ced702fcf61d","Type":"ContainerStarted","Data":"5397326bdc26ef52d60f42d3d2966ef8a4ba4b8fe6f2730f5119bd6c96c229f8"} Dec 01 15:16:21 crc kubenswrapper[4931]: I1201 15:16:21.334493 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" event={"ID":"a34fc488-c895-4f50-9164-ced702fcf61d","Type":"ContainerStarted","Data":"991e3b87447d83716870647dbccc2df23b4412439eb19ead00310657263aa8e4"} Dec 01 15:16:21 crc kubenswrapper[4931]: I1201 15:16:21.335489 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" Dec 01 15:16:21 crc kubenswrapper[4931]: I1201 15:16:21.392694 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" podStartSLOduration=2.401621205 podStartE2EDuration="7.392662791s" podCreationTimestamp="2025-12-01 15:16:14 +0000 UTC" firstStartedPulling="2025-12-01 15:16:15.3266688 +0000 UTC m=+921.752542467" lastFinishedPulling="2025-12-01 15:16:20.317710386 +0000 UTC m=+926.743584053" observedRunningTime="2025-12-01 15:16:21.366753174 +0000 UTC m=+927.792626891" watchObservedRunningTime="2025-12-01 15:16:21.392662791 +0000 UTC m=+927.818536478" Dec 01 15:16:34 crc kubenswrapper[4931]: I1201 15:16:34.886823 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-654ffbd64b-qsbsr" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.492959 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.495492 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.498718 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4fkvx" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.524596 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.532240 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.533510 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.538713 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2lxr5" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.546071 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.547526 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.558143 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.562020 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-fcwhv" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.569269 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.570724 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.575475 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-94fz5" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.580085 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps4w6\" (UniqueName: \"kubernetes.io/projected/71d6d824-1cf3-4984-b8de-f10d19192a5f-kube-api-access-ps4w6\") pod \"barbican-operator-controller-manager-7d9dfd778-wcntp\" (UID: \"71d6d824-1cf3-4984-b8de-f10d19192a5f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.585532 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.590618 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.616814 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.618188 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.622875 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-h6sht" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.623102 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.660906 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.662036 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.670359 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-w7r77" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.682276 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpf7g\" (UniqueName: \"kubernetes.io/projected/006dbf05-46c7-4348-a9bb-74a7c56fd3fd-kube-api-access-zpf7g\") pod \"glance-operator-controller-manager-668d9c48b9-g8vfm\" (UID: \"006dbf05-46c7-4348-a9bb-74a7c56fd3fd\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.682433 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4w6\" (UniqueName: \"kubernetes.io/projected/71d6d824-1cf3-4984-b8de-f10d19192a5f-kube-api-access-ps4w6\") pod \"barbican-operator-controller-manager-7d9dfd778-wcntp\" (UID: \"71d6d824-1cf3-4984-b8de-f10d19192a5f\") " 
pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.682459 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ndnv\" (UniqueName: \"kubernetes.io/projected/8258a972-ead1-4bee-ae4f-cba90b238dde-kube-api-access-7ndnv\") pod \"designate-operator-controller-manager-78b4bc895b-g2jqf\" (UID: \"8258a972-ead1-4bee-ae4f-cba90b238dde\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.682494 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwj2f\" (UniqueName: \"kubernetes.io/projected/fff7f112-af9f-42e3-beef-e0efdcf602c9-kube-api-access-jwj2f\") pod \"cinder-operator-controller-manager-859b6ccc6-lr4mz\" (UID: \"fff7f112-af9f-42e3-beef-e0efdcf602c9\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.688800 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.704939 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.706261 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.711325 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-n95bt" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.711454 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.712839 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.713376 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.720845 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kkdgd" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.744005 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.751830 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4w6\" (UniqueName: \"kubernetes.io/projected/71d6d824-1cf3-4984-b8de-f10d19192a5f-kube-api-access-ps4w6\") pod \"barbican-operator-controller-manager-7d9dfd778-wcntp\" (UID: \"71d6d824-1cf3-4984-b8de-f10d19192a5f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.760153 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd"] Dec 01 15:16:54 crc kubenswrapper[4931]: 
I1201 15:16:54.783735 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqms\" (UniqueName: \"kubernetes.io/projected/6b36c7c7-1886-476e-b0d8-50168c04ff83-kube-api-access-vzqms\") pod \"horizon-operator-controller-manager-68c6d99b8f-6x5bb\" (UID: \"6b36c7c7-1886-476e-b0d8-50168c04ff83\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.783819 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ndnv\" (UniqueName: \"kubernetes.io/projected/8258a972-ead1-4bee-ae4f-cba90b238dde-kube-api-access-7ndnv\") pod \"designate-operator-controller-manager-78b4bc895b-g2jqf\" (UID: \"8258a972-ead1-4bee-ae4f-cba90b238dde\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.783865 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwj2f\" (UniqueName: \"kubernetes.io/projected/fff7f112-af9f-42e3-beef-e0efdcf602c9-kube-api-access-jwj2f\") pod \"cinder-operator-controller-manager-859b6ccc6-lr4mz\" (UID: \"fff7f112-af9f-42e3-beef-e0efdcf602c9\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.783886 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6lvl\" (UniqueName: \"kubernetes.io/projected/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-kube-api-access-h6lvl\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.783913 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hz8gp\" (UniqueName: \"kubernetes.io/projected/921c2e7b-0f37-4e93-ab2e-76a23e146d28-kube-api-access-hz8gp\") pod \"heat-operator-controller-manager-5f64f6f8bb-2fp96\" (UID: \"921c2e7b-0f37-4e93-ab2e-76a23e146d28\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.783941 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.783966 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpf7g\" (UniqueName: \"kubernetes.io/projected/006dbf05-46c7-4348-a9bb-74a7c56fd3fd-kube-api-access-zpf7g\") pod \"glance-operator-controller-manager-668d9c48b9-g8vfm\" (UID: \"006dbf05-46c7-4348-a9bb-74a7c56fd3fd\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.808501 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.809995 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.824050 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fvnf2" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.826988 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.830047 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ndnv\" (UniqueName: \"kubernetes.io/projected/8258a972-ead1-4bee-ae4f-cba90b238dde-kube-api-access-7ndnv\") pod \"designate-operator-controller-manager-78b4bc895b-g2jqf\" (UID: \"8258a972-ead1-4bee-ae4f-cba90b238dde\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.832262 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpf7g\" (UniqueName: \"kubernetes.io/projected/006dbf05-46c7-4348-a9bb-74a7c56fd3fd-kube-api-access-zpf7g\") pod \"glance-operator-controller-manager-668d9c48b9-g8vfm\" (UID: \"006dbf05-46c7-4348-a9bb-74a7c56fd3fd\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.841717 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwj2f\" (UniqueName: \"kubernetes.io/projected/fff7f112-af9f-42e3-beef-e0efdcf602c9-kube-api-access-jwj2f\") pod \"cinder-operator-controller-manager-859b6ccc6-lr4mz\" (UID: \"fff7f112-af9f-42e3-beef-e0efdcf602c9\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.852114 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.856322 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.871615 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.877146 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.879899 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.887642 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-twklz" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.889918 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8zq\" (UniqueName: \"kubernetes.io/projected/e20c64cb-88d5-4ffe-bb88-8715010ccf33-kube-api-access-tj8zq\") pod \"ironic-operator-controller-manager-6c548fd776-gzpgd\" (UID: \"e20c64cb-88d5-4ffe-bb88-8715010ccf33\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.890043 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqms\" (UniqueName: \"kubernetes.io/projected/6b36c7c7-1886-476e-b0d8-50168c04ff83-kube-api-access-vzqms\") pod \"horizon-operator-controller-manager-68c6d99b8f-6x5bb\" (UID: \"6b36c7c7-1886-476e-b0d8-50168c04ff83\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.890174 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5hsr\" (UniqueName: \"kubernetes.io/projected/56ca76b6-8e16-4d73-9e78-f20e046738fc-kube-api-access-p5hsr\") pod 
\"keystone-operator-controller-manager-546d4bdf48-l24dv\" (UID: \"56ca76b6-8e16-4d73-9e78-f20e046738fc\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.890256 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6lvl\" (UniqueName: \"kubernetes.io/projected/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-kube-api-access-h6lvl\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.890341 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz8gp\" (UniqueName: \"kubernetes.io/projected/921c2e7b-0f37-4e93-ab2e-76a23e146d28-kube-api-access-hz8gp\") pod \"heat-operator-controller-manager-5f64f6f8bb-2fp96\" (UID: \"921c2e7b-0f37-4e93-ab2e-76a23e146d28\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.890448 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:16:54 crc kubenswrapper[4931]: E1201 15:16:54.890649 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 15:16:54 crc kubenswrapper[4931]: E1201 15:16:54.890756 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert podName:371eef0f-aae7-40bd-9c47-1ffd0e77e08d nodeName:}" failed. 
No retries permitted until 2025-12-01 15:16:55.390737544 +0000 UTC m=+961.816611201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert") pod "infra-operator-controller-manager-57548d458d-sk9bq" (UID: "371eef0f-aae7-40bd-9c47-1ffd0e77e08d") : secret "infra-operator-webhook-server-cert" not found Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.894060 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.898777 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb"] Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.953521 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz8gp\" (UniqueName: \"kubernetes.io/projected/921c2e7b-0f37-4e93-ab2e-76a23e146d28-kube-api-access-hz8gp\") pod \"heat-operator-controller-manager-5f64f6f8bb-2fp96\" (UID: \"921c2e7b-0f37-4e93-ab2e-76a23e146d28\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.959120 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqms\" (UniqueName: \"kubernetes.io/projected/6b36c7c7-1886-476e-b0d8-50168c04ff83-kube-api-access-vzqms\") pod \"horizon-operator-controller-manager-68c6d99b8f-6x5bb\" (UID: \"6b36c7c7-1886-476e-b0d8-50168c04ff83\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" Dec 01 15:16:54 crc kubenswrapper[4931]: I1201 15:16:54.992906 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6lvl\" (UniqueName: \"kubernetes.io/projected/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-kube-api-access-h6lvl\") pod 
\"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.008311 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8zq\" (UniqueName: \"kubernetes.io/projected/e20c64cb-88d5-4ffe-bb88-8715010ccf33-kube-api-access-tj8zq\") pod \"ironic-operator-controller-manager-6c548fd776-gzpgd\" (UID: \"e20c64cb-88d5-4ffe-bb88-8715010ccf33\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.008441 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.008523 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmczw\" (UniqueName: \"kubernetes.io/projected/1c0a3def-dea0-40f3-8368-a36f6030f7f7-kube-api-access-tmczw\") pod \"manila-operator-controller-manager-6546668bfd-bpqmb\" (UID: \"1c0a3def-dea0-40f3-8368-a36f6030f7f7\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.008718 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5hsr\" (UniqueName: \"kubernetes.io/projected/56ca76b6-8e16-4d73-9e78-f20e046738fc-kube-api-access-p5hsr\") pod \"keystone-operator-controller-manager-546d4bdf48-l24dv\" (UID: \"56ca76b6-8e16-4d73-9e78-f20e046738fc\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.017032 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.050926 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj8zq\" (UniqueName: \"kubernetes.io/projected/e20c64cb-88d5-4ffe-bb88-8715010ccf33-kube-api-access-tj8zq\") pod \"ironic-operator-controller-manager-6c548fd776-gzpgd\" (UID: \"e20c64cb-88d5-4ffe-bb88-8715010ccf33\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.052262 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.093181 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-z7msn" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.105573 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5hsr\" (UniqueName: \"kubernetes.io/projected/56ca76b6-8e16-4d73-9e78-f20e046738fc-kube-api-access-p5hsr\") pod \"keystone-operator-controller-manager-546d4bdf48-l24dv\" (UID: \"56ca76b6-8e16-4d73-9e78-f20e046738fc\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.109344 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.119078 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmczw\" (UniqueName: \"kubernetes.io/projected/1c0a3def-dea0-40f3-8368-a36f6030f7f7-kube-api-access-tmczw\") pod \"manila-operator-controller-manager-6546668bfd-bpqmb\" (UID: \"1c0a3def-dea0-40f3-8368-a36f6030f7f7\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.150050 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.153777 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.159141 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-td57n" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.162523 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.163030 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmczw\" (UniqueName: \"kubernetes.io/projected/1c0a3def-dea0-40f3-8368-a36f6030f7f7-kube-api-access-tmczw\") pod \"manila-operator-controller-manager-6546668bfd-bpqmb\" (UID: \"1c0a3def-dea0-40f3-8368-a36f6030f7f7\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.220042 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.238092 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv58z\" (UniqueName: \"kubernetes.io/projected/e32d4db7-aa36-4724-a113-2f7ff2af254d-kube-api-access-dv58z\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-jvkpz\" (UID: \"e32d4db7-aa36-4724-a113-2f7ff2af254d\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.238221 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4kml\" (UniqueName: \"kubernetes.io/projected/b30439fb-2d71-4c3b-97ec-5e304c1eb15e-kube-api-access-r4kml\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ms6t7\" (UID: \"b30439fb-2d71-4c3b-97ec-5e304c1eb15e\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.238401 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.239931 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.241536 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.244133 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-6nhxh" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.253987 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.274515 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.276279 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.277219 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.292462 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.305296 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9hrqr" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.305659 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.306794 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.313139 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-sc5j7" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.313348 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.339813 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv58z\" (UniqueName: \"kubernetes.io/projected/e32d4db7-aa36-4724-a113-2f7ff2af254d-kube-api-access-dv58z\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-jvkpz\" (UID: \"e32d4db7-aa36-4724-a113-2f7ff2af254d\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.339923 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqnws\" (UniqueName: \"kubernetes.io/projected/3dc1d698-63f8-4e4b-8e28-6e128b5b46da-kube-api-access-tqnws\") pod \"nova-operator-controller-manager-697bc559fc-h5c6g\" (UID: \"3dc1d698-63f8-4e4b-8e28-6e128b5b46da\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.339963 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd82q\" (UniqueName: \"kubernetes.io/projected/b366d143-b330-4141-be76-b87796b94301-kube-api-access-fd82q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:16:55 crc 
kubenswrapper[4931]: I1201 15:16:55.339991 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.340018 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4kml\" (UniqueName: \"kubernetes.io/projected/b30439fb-2d71-4c3b-97ec-5e304c1eb15e-kube-api-access-r4kml\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ms6t7\" (UID: \"b30439fb-2d71-4c3b-97ec-5e304c1eb15e\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.340053 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd642\" (UniqueName: \"kubernetes.io/projected/e45f587b-3120-4b16-9c9a-66bc5c1252aa-kube-api-access-gd642\") pod \"octavia-operator-controller-manager-998648c74-mtpcz\" (UID: \"e45f587b-3120-4b16-9c9a-66bc5c1252aa\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.356085 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.390637 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv58z\" (UniqueName: \"kubernetes.io/projected/e32d4db7-aa36-4724-a113-2f7ff2af254d-kube-api-access-dv58z\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-jvkpz\" (UID: \"e32d4db7-aa36-4724-a113-2f7ff2af254d\") " 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.397539 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4kml\" (UniqueName: \"kubernetes.io/projected/b30439fb-2d71-4c3b-97ec-5e304c1eb15e-kube-api-access-r4kml\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ms6t7\" (UID: \"b30439fb-2d71-4c3b-97ec-5e304c1eb15e\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.407446 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.408905 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.422402 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-6twcs" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.425569 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.427267 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.433796 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-knw82" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.444563 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd642\" (UniqueName: \"kubernetes.io/projected/e45f587b-3120-4b16-9c9a-66bc5c1252aa-kube-api-access-gd642\") pod \"octavia-operator-controller-manager-998648c74-mtpcz\" (UID: \"e45f587b-3120-4b16-9c9a-66bc5c1252aa\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.444911 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmd7q\" (UniqueName: \"kubernetes.io/projected/a35a63a2-123b-45f4-99fe-4a7baece61be-kube-api-access-cmd7q\") pod \"placement-operator-controller-manager-78f8948974-n4w4t\" (UID: \"a35a63a2-123b-45f4-99fe-4a7baece61be\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.445051 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.445139 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqnws\" (UniqueName: \"kubernetes.io/projected/3dc1d698-63f8-4e4b-8e28-6e128b5b46da-kube-api-access-tqnws\") pod 
\"nova-operator-controller-manager-697bc559fc-h5c6g\" (UID: \"3dc1d698-63f8-4e4b-8e28-6e128b5b46da\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.445239 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd82q\" (UniqueName: \"kubernetes.io/projected/b366d143-b330-4141-be76-b87796b94301-kube-api-access-fd82q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.445305 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.445419 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert podName:371eef0f-aae7-40bd-9c47-1ffd0e77e08d nodeName:}" failed. No retries permitted until 2025-12-01 15:16:56.445374606 +0000 UTC m=+962.871248273 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert") pod "infra-operator-controller-manager-57548d458d-sk9bq" (UID: "371eef0f-aae7-40bd-9c47-1ffd0e77e08d") : secret "infra-operator-webhook-server-cert" not found Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.445598 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.445716 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert podName:b366d143-b330-4141-be76-b87796b94301 nodeName:}" failed. No retries permitted until 2025-12-01 15:16:55.945697585 +0000 UTC m=+962.371571242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" (UID: "b366d143-b330-4141-be76-b87796b94301") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.445331 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.448346 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.459519 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.470674 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd642\" (UniqueName: \"kubernetes.io/projected/e45f587b-3120-4b16-9c9a-66bc5c1252aa-kube-api-access-gd642\") pod \"octavia-operator-controller-manager-998648c74-mtpcz\" (UID: \"e45f587b-3120-4b16-9c9a-66bc5c1252aa\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.482341 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqnws\" (UniqueName: \"kubernetes.io/projected/3dc1d698-63f8-4e4b-8e28-6e128b5b46da-kube-api-access-tqnws\") pod \"nova-operator-controller-manager-697bc559fc-h5c6g\" (UID: \"3dc1d698-63f8-4e4b-8e28-6e128b5b46da\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.488336 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd82q\" (UniqueName: \"kubernetes.io/projected/b366d143-b330-4141-be76-b87796b94301-kube-api-access-fd82q\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.502532 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.504510 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.519748 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-2r42l" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.521506 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.531991 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.543006 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.546975 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmd7q\" (UniqueName: \"kubernetes.io/projected/a35a63a2-123b-45f4-99fe-4a7baece61be-kube-api-access-cmd7q\") pod \"placement-operator-controller-manager-78f8948974-n4w4t\" (UID: \"a35a63a2-123b-45f4-99fe-4a7baece61be\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.547043 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtm8c\" (UniqueName: \"kubernetes.io/projected/31166fda-e2fe-4a4a-9717-550172ed4093-kube-api-access-wtm8c\") pod \"ovn-operator-controller-manager-b6456fdb6-g9s8h\" (UID: \"31166fda-e2fe-4a4a-9717-550172ed4093\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.574427 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.575686 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.590670 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6q7fg" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.591495 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.596168 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.597846 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.603519 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nz4hx" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.603982 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.612060 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.613787 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmd7q\" (UniqueName: \"kubernetes.io/projected/a35a63a2-123b-45f4-99fe-4a7baece61be-kube-api-access-cmd7q\") pod \"placement-operator-controller-manager-78f8948974-n4w4t\" (UID: \"a35a63a2-123b-45f4-99fe-4a7baece61be\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.640016 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.648929 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.650066 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnhb\" (UniqueName: \"kubernetes.io/projected/12e80a83-7d9f-417b-934e-83d23085f11b-kube-api-access-wtnhb\") pod \"swift-operator-controller-manager-5f8c65bbfc-vv2tp\" (UID: \"12e80a83-7d9f-417b-934e-83d23085f11b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.650823 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtm8c\" (UniqueName: \"kubernetes.io/projected/31166fda-e2fe-4a4a-9717-550172ed4093-kube-api-access-wtm8c\") pod \"ovn-operator-controller-manager-b6456fdb6-g9s8h\" (UID: \"31166fda-e2fe-4a4a-9717-550172ed4093\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.650534 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.653494 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lgf7f" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.667569 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.685923 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.687203 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtm8c\" (UniqueName: \"kubernetes.io/projected/31166fda-e2fe-4a4a-9717-550172ed4093-kube-api-access-wtm8c\") pod \"ovn-operator-controller-manager-b6456fdb6-g9s8h\" (UID: \"31166fda-e2fe-4a4a-9717-550172ed4093\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.748879 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.750972 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.756304 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.756544 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.756987 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-2mg2w" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.758860 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrf9h\" (UniqueName: \"kubernetes.io/projected/75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae-kube-api-access-wrf9h\") pod \"test-operator-controller-manager-5854674fcc-lsxmf\" (UID: \"75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.759007 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnhb\" (UniqueName: \"kubernetes.io/projected/12e80a83-7d9f-417b-934e-83d23085f11b-kube-api-access-wtnhb\") pod \"swift-operator-controller-manager-5f8c65bbfc-vv2tp\" (UID: \"12e80a83-7d9f-417b-934e-83d23085f11b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.760292 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2k6n\" (UniqueName: \"kubernetes.io/projected/39f1253b-afe7-47b1-8c68-2a36d49f969b-kube-api-access-w2k6n\") pod \"telemetry-operator-controller-manager-76cc84c6bb-rv87n\" (UID: \"39f1253b-afe7-47b1-8c68-2a36d49f969b\") " 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.760341 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66b5\" (UniqueName: \"kubernetes.io/projected/18b9fb30-a34f-42ad-9692-84c532f586d6-kube-api-access-t66b5\") pod \"watcher-operator-controller-manager-769dc69bc-6kbkb\" (UID: \"18b9fb30-a34f-42ad-9692-84c532f586d6\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.788598 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.789225 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnhb\" (UniqueName: \"kubernetes.io/projected/12e80a83-7d9f-417b-934e-83d23085f11b-kube-api-access-wtnhb\") pod \"swift-operator-controller-manager-5f8c65bbfc-vv2tp\" (UID: \"12e80a83-7d9f-417b-934e-83d23085f11b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.832354 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.835960 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.837357 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.844090 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m"] Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.849379 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mw8qn" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.862357 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.863133 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.863180 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572d6\" (UniqueName: \"kubernetes.io/projected/803870f9-7602-4eae-ba61-09e7aa4c63bb-kube-api-access-572d6\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.863442 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs\") pod 
\"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.863472 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2k6n\" (UniqueName: \"kubernetes.io/projected/39f1253b-afe7-47b1-8c68-2a36d49f969b-kube-api-access-w2k6n\") pod \"telemetry-operator-controller-manager-76cc84c6bb-rv87n\" (UID: \"39f1253b-afe7-47b1-8c68-2a36d49f969b\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.863497 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66b5\" (UniqueName: \"kubernetes.io/projected/18b9fb30-a34f-42ad-9692-84c532f586d6-kube-api-access-t66b5\") pod \"watcher-operator-controller-manager-769dc69bc-6kbkb\" (UID: \"18b9fb30-a34f-42ad-9692-84c532f586d6\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.863576 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrf9h\" (UniqueName: \"kubernetes.io/projected/75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae-kube-api-access-wrf9h\") pod \"test-operator-controller-manager-5854674fcc-lsxmf\" (UID: \"75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.893695 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t66b5\" (UniqueName: \"kubernetes.io/projected/18b9fb30-a34f-42ad-9692-84c532f586d6-kube-api-access-t66b5\") pod \"watcher-operator-controller-manager-769dc69bc-6kbkb\" (UID: \"18b9fb30-a34f-42ad-9692-84c532f586d6\") " 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.894076 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrf9h\" (UniqueName: \"kubernetes.io/projected/75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae-kube-api-access-wrf9h\") pod \"test-operator-controller-manager-5854674fcc-lsxmf\" (UID: \"75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.917855 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2k6n\" (UniqueName: \"kubernetes.io/projected/39f1253b-afe7-47b1-8c68-2a36d49f969b-kube-api-access-w2k6n\") pod \"telemetry-operator-controller-manager-76cc84c6bb-rv87n\" (UID: \"39f1253b-afe7-47b1-8c68-2a36d49f969b\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.921648 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.967330 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.967635 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.967730 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv2kk\" (UniqueName: \"kubernetes.io/projected/cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3-kube-api-access-xv2kk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zbd9m\" (UID: \"cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.967819 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-572d6\" (UniqueName: \"kubernetes.io/projected/803870f9-7602-4eae-ba61-09e7aa4c63bb-kube-api-access-572d6\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:55 crc kubenswrapper[4931]: I1201 15:16:55.967921 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.968123 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.968242 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:16:56.468222942 +0000 UTC m=+962.894096609 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "webhook-server-cert" not found Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.968464 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.968554 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert podName:b366d143-b330-4141-be76-b87796b94301 nodeName:}" failed. No retries permitted until 2025-12-01 15:16:56.968523621 +0000 UTC m=+963.394397288 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" (UID: "b366d143-b330-4141-be76-b87796b94301") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.968627 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 15:16:55 crc kubenswrapper[4931]: E1201 15:16:55.968660 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:16:56.468648334 +0000 UTC m=+962.894522221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "metrics-server-cert" not found Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.006544 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-572d6\" (UniqueName: \"kubernetes.io/projected/803870f9-7602-4eae-ba61-09e7aa4c63bb-kube-api-access-572d6\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.072238 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv2kk\" (UniqueName: \"kubernetes.io/projected/cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3-kube-api-access-xv2kk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zbd9m\" (UID: 
\"cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.093028 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv2kk\" (UniqueName: \"kubernetes.io/projected/cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3-kube-api-access-xv2kk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-zbd9m\" (UID: \"cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.137201 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.152186 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.161919 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.176300 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.329867 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.479812 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.479876 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.479963 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.480106 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.480234 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:16:57.480207005 +0000 UTC m=+963.906080672 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "webhook-server-cert" not found Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.480133 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.480143 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.480346 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:16:57.480312398 +0000 UTC m=+963.906186065 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "metrics-server-cert" not found Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.480395 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert podName:371eef0f-aae7-40bd-9c47-1ffd0e77e08d nodeName:}" failed. No retries permitted until 2025-12-01 15:16:58.480358929 +0000 UTC m=+964.906232596 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert") pod "infra-operator-controller-manager-57548d458d-sk9bq" (UID: "371eef0f-aae7-40bd-9c47-1ffd0e77e08d") : secret "infra-operator-webhook-server-cert" not found Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.508998 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.546437 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.553191 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp"] Dec 01 15:16:56 crc kubenswrapper[4931]: W1201 15:16:56.570312 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71d6d824_1cf3_4984_b8de_f10d19192a5f.slice/crio-45376541f116e931cfa7776f10878f9de69e274ee0ec4340a25d18b9400e1921 WatchSource:0}: Error finding container 45376541f116e931cfa7776f10878f9de69e274ee0ec4340a25d18b9400e1921: Status 404 returned error can't find the container with id 45376541f116e931cfa7776f10878f9de69e274ee0ec4340a25d18b9400e1921 Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.571349 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf"] Dec 01 15:16:56 crc kubenswrapper[4931]: W1201 15:16:56.571862 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8258a972_ead1_4bee_ae4f_cba90b238dde.slice/crio-c1b19908f6ecef18b85ab588337c7ec85135ced2dce2b0e3df9e330c097947d6 WatchSource:0}: Error finding container 
c1b19908f6ecef18b85ab588337c7ec85135ced2dce2b0e3df9e330c097947d6: Status 404 returned error can't find the container with id c1b19908f6ecef18b85ab588337c7ec85135ced2dce2b0e3df9e330c097947d6 Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.578249 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm"] Dec 01 15:16:56 crc kubenswrapper[4931]: W1201 15:16:56.584599 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod006dbf05_46c7_4348_a9bb_74a7c56fd3fd.slice/crio-07f071939bc5da1eda2dad04dabc02bbc2166c83c4926ada52b42ab7e9045fe0 WatchSource:0}: Error finding container 07f071939bc5da1eda2dad04dabc02bbc2166c83c4926ada52b42ab7e9045fe0: Status 404 returned error can't find the container with id 07f071939bc5da1eda2dad04dabc02bbc2166c83c4926ada52b42ab7e9045fe0 Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.634308 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" event={"ID":"006dbf05-46c7-4348-a9bb-74a7c56fd3fd","Type":"ContainerStarted","Data":"07f071939bc5da1eda2dad04dabc02bbc2166c83c4926ada52b42ab7e9045fe0"} Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.643067 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" event={"ID":"fff7f112-af9f-42e3-beef-e0efdcf602c9","Type":"ContainerStarted","Data":"699057a80d2275d920130c8c5bb4d444b732918056d380340c142b939486ec1b"} Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.655260 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" event={"ID":"71d6d824-1cf3-4984-b8de-f10d19192a5f","Type":"ContainerStarted","Data":"45376541f116e931cfa7776f10878f9de69e274ee0ec4340a25d18b9400e1921"} Dec 01 15:16:56 crc 
kubenswrapper[4931]: I1201 15:16:56.656372 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" event={"ID":"8258a972-ead1-4bee-ae4f-cba90b238dde","Type":"ContainerStarted","Data":"c1b19908f6ecef18b85ab588337c7ec85135ced2dce2b0e3df9e330c097947d6"} Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.658763 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" event={"ID":"e20c64cb-88d5-4ffe-bb88-8715010ccf33","Type":"ContainerStarted","Data":"c0a96630d2fd53e218260178276189c2898b52a68f7c899d6e1a5eca31aa19a8"} Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.659698 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" event={"ID":"6b36c7c7-1886-476e-b0d8-50168c04ff83","Type":"ContainerStarted","Data":"ad2852bd37fdf4db766f283c08386b1ce303de0a3eac91e89845aff19d72d584"} Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.866192 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.884680 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.917696 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.922935 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.928673 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.935509 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.943459 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.951287 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.960426 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t"] Dec 01 15:16:56 crc kubenswrapper[4931]: I1201 15:16:56.986797 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.987146 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dv58z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-jvkpz_openstack-operators(e32d4db7-aa36-4724-a113-2f7ff2af254d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.987421 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.987516 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert podName:b366d143-b330-4141-be76-b87796b94301 nodeName:}" failed. No retries permitted until 2025-12-01 15:16:58.987492753 +0000 UTC m=+965.413366420 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" (UID: "b366d143-b330-4141-be76-b87796b94301") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.989449 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dv58z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-jvkpz_openstack-operators(e32d4db7-aa36-4724-a113-2f7ff2af254d): ErrImagePull: pull 
QPS exceeded" logger="UnhandledError" Dec 01 15:16:56 crc kubenswrapper[4931]: E1201 15:16:56.990943 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" podUID="e32d4db7-aa36-4724-a113-2f7ff2af254d" Dec 01 15:16:57 crc kubenswrapper[4931]: W1201 15:16:57.005815 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ca76b6_8e16_4d73_9e78_f20e046738fc.slice/crio-2adef51110ab77a1de07c5832ef07e5fe4dd4b83a4546a7e0e3aa11d39b114d2 WatchSource:0}: Error finding container 2adef51110ab77a1de07c5832ef07e5fe4dd4b83a4546a7e0e3aa11d39b114d2: Status 404 returned error can't find the container with id 2adef51110ab77a1de07c5832ef07e5fe4dd4b83a4546a7e0e3aa11d39b114d2 Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.011215 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p5hsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-l24dv_openstack-operators(56ca76b6-8e16-4d73-9e78-f20e046738fc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.022118 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p5hsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-l24dv_openstack-operators(56ca76b6-8e16-4d73-9e78-f20e046738fc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.024522 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" podUID="56ca76b6-8e16-4d73-9e78-f20e046738fc" Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.109078 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g"] Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.152201 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf"] Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.158316 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb"] Dec 01 15:16:57 crc kubenswrapper[4931]: W1201 15:16:57.183669 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18b9fb30_a34f_42ad_9692_84c532f586d6.slice/crio-aeb98ad442775893299d37165f3334ea4addcc06d8da0a5d812c00812d01cbbe WatchSource:0}: Error finding container aeb98ad442775893299d37165f3334ea4addcc06d8da0a5d812c00812d01cbbe: Status 404 returned error can't find the container with id aeb98ad442775893299d37165f3334ea4addcc06d8da0a5d812c00812d01cbbe Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.187298 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t66b5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-6kbkb_openstack-operators(18b9fb30-a34f-42ad-9692-84c532f586d6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:57 crc kubenswrapper[4931]: W1201 15:16:57.187736 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc03b8f7_9a1d_4f98_a70f_6da587e8d1d3.slice/crio-19463de39c5a5edf5044211c27e3f4e7c93877c6a03e654c2838528390da2c2a WatchSource:0}: Error finding container 
19463de39c5a5edf5044211c27e3f4e7c93877c6a03e654c2838528390da2c2a: Status 404 returned error can't find the container with id 19463de39c5a5edf5044211c27e3f4e7c93877c6a03e654c2838528390da2c2a Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.191026 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t66b5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-6kbkb_openstack-operators(18b9fb30-a34f-42ad-9692-84c532f586d6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.191132 4931 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n"] Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.192400 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" podUID="18b9fb30-a34f-42ad-9692-84c532f586d6" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.193723 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xv2kk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-zbd9m_openstack-operators(cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.194973 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" podUID="cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.197409 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w2k6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-rv87n_openstack-operators(39f1253b-afe7-47b1-8c68-2a36d49f969b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.197442 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrf9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-lsxmf_openstack-operators(75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.199476 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w2k6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-rv87n_openstack-operators(39f1253b-afe7-47b1-8c68-2a36d49f969b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.199652 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrf9h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-lsxmf_openstack-operators(75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.200713 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" podUID="39f1253b-afe7-47b1-8c68-2a36d49f969b" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.200823 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" 
podUID="75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae" Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.208828 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m"] Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.497834 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.497912 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.498081 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.498143 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:16:59.498124428 +0000 UTC m=+965.923998095 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "webhook-server-cert" not found Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.498557 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.498588 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:16:59.498578671 +0000 UTC m=+965.924452328 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "metrics-server-cert" not found Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.682197 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" event={"ID":"b30439fb-2d71-4c3b-97ec-5e304c1eb15e","Type":"ContainerStarted","Data":"2da257c1802d3c00ed8d80446e0c4263b71fc1ee215b38bb46fea95c4c7b8b1a"} Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.684136 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" event={"ID":"1c0a3def-dea0-40f3-8368-a36f6030f7f7","Type":"ContainerStarted","Data":"46a517c35d2073184e4d2dca9bf2ac1ad49265f90954682687544d8077d31443"} Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.686129 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" event={"ID":"a35a63a2-123b-45f4-99fe-4a7baece61be","Type":"ContainerStarted","Data":"723f88b90ebb7aea50b59839513c18704844e5436203a2208e820203a9ecdb6f"} Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.687105 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" event={"ID":"3dc1d698-63f8-4e4b-8e28-6e128b5b46da","Type":"ContainerStarted","Data":"62b17d458a719b74f957b352d34e70a18099ad717628fbb4b818296d747b8faf"} Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.688733 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" event={"ID":"e45f587b-3120-4b16-9c9a-66bc5c1252aa","Type":"ContainerStarted","Data":"a30a1c8f2a18b866295a271620e1ac37b229916004a99511b783b2bf0c9d6fe0"} Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.689841 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" event={"ID":"e32d4db7-aa36-4724-a113-2f7ff2af254d","Type":"ContainerStarted","Data":"320330a5dae9f613cb84e3941ba7e5e492237428b205434eec287f2d95e0cb4c"} Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.692632 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" event={"ID":"75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae","Type":"ContainerStarted","Data":"17f3d3f50ec9065a63445f1052cab9679dc3d9460dd17bfecee62b68ec6865b0"} Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.694674 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" 
for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" podUID="75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae" Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.697302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" event={"ID":"39f1253b-afe7-47b1-8c68-2a36d49f969b","Type":"ContainerStarted","Data":"f1fadbe7907f98dec07b24cbebd5da9e5f8464650a9c84815e3f60c7e6e08a49"} Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.701542 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" event={"ID":"56ca76b6-8e16-4d73-9e78-f20e046738fc","Type":"ContainerStarted","Data":"2adef51110ab77a1de07c5832ef07e5fe4dd4b83a4546a7e0e3aa11d39b114d2"} Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.706597 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" podUID="39f1253b-afe7-47b1-8c68-2a36d49f969b" Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.707167 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" event={"ID":"12e80a83-7d9f-417b-934e-83d23085f11b","Type":"ContainerStarted","Data":"960d8380d50358135fbd94561411f86656de73f9b6d3cd89584326f2ae72f379"} Dec 01 15:16:57 crc kubenswrapper[4931]: 
E1201 15:16:57.710037 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" podUID="e32d4db7-aa36-4724-a113-2f7ff2af254d" Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.712459 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" podUID="56ca76b6-8e16-4d73-9e78-f20e046738fc" Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.716988 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" event={"ID":"cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3","Type":"ContainerStarted","Data":"19463de39c5a5edf5044211c27e3f4e7c93877c6a03e654c2838528390da2c2a"} Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.732627 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" podUID="cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3" Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.743443 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" event={"ID":"18b9fb30-a34f-42ad-9692-84c532f586d6","Type":"ContainerStarted","Data":"aeb98ad442775893299d37165f3334ea4addcc06d8da0a5d812c00812d01cbbe"} Dec 01 15:16:57 crc kubenswrapper[4931]: E1201 15:16:57.746652 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" podUID="18b9fb30-a34f-42ad-9692-84c532f586d6" Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.746839 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" event={"ID":"31166fda-e2fe-4a4a-9717-550172ed4093","Type":"ContainerStarted","Data":"a141b6344fe5c2d3df698ca8be995d90d395cac4a79b77fd72327cf98f3345a5"} Dec 01 15:16:57 crc kubenswrapper[4931]: I1201 15:16:57.749437 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" event={"ID":"921c2e7b-0f37-4e93-ab2e-76a23e146d28","Type":"ContainerStarted","Data":"edef77a9285744880f5c98817e56cad79b96df5dac384640d272e540f16167e1"} Dec 01 15:16:58 crc kubenswrapper[4931]: I1201 15:16:58.519259 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:16:58 crc kubenswrapper[4931]: E1201 15:16:58.519855 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 15:16:58 crc kubenswrapper[4931]: E1201 15:16:58.519922 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert podName:371eef0f-aae7-40bd-9c47-1ffd0e77e08d nodeName:}" failed. No retries permitted until 2025-12-01 15:17:02.519903152 +0000 UTC m=+968.945776819 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert") pod "infra-operator-controller-manager-57548d458d-sk9bq" (UID: "371eef0f-aae7-40bd-9c47-1ffd0e77e08d") : secret "infra-operator-webhook-server-cert" not found Dec 01 15:16:58 crc kubenswrapper[4931]: E1201 15:16:58.780099 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" podUID="cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3" Dec 01 15:16:58 crc kubenswrapper[4931]: E1201 15:16:58.780515 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for 
\"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" podUID="18b9fb30-a34f-42ad-9692-84c532f586d6" Dec 01 15:16:58 crc kubenswrapper[4931]: E1201 15:16:58.780565 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" podUID="39f1253b-afe7-47b1-8c68-2a36d49f969b" Dec 01 15:16:58 crc kubenswrapper[4931]: E1201 15:16:58.780682 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" podUID="e32d4db7-aa36-4724-a113-2f7ff2af254d" Dec 01 15:16:58 crc kubenswrapper[4931]: E1201 15:16:58.780980 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" podUID="56ca76b6-8e16-4d73-9e78-f20e046738fc" Dec 01 15:16:58 crc kubenswrapper[4931]: E1201 15:16:58.796699 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" podUID="75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae" Dec 01 15:16:59 crc kubenswrapper[4931]: E1201 15:16:59.036512 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:16:59 crc kubenswrapper[4931]: E1201 15:16:59.036621 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert podName:b366d143-b330-4141-be76-b87796b94301 nodeName:}" failed. No retries permitted until 2025-12-01 15:17:03.036595821 +0000 UTC m=+969.462469488 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" (UID: "b366d143-b330-4141-be76-b87796b94301") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:16:59 crc kubenswrapper[4931]: I1201 15:16:59.036232 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:16:59 crc kubenswrapper[4931]: I1201 15:16:59.546971 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:59 crc kubenswrapper[4931]: E1201 15:16:59.547227 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 15:16:59 crc kubenswrapper[4931]: E1201 15:16:59.547328 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:17:03.547306448 +0000 UTC m=+969.973180115 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "webhook-server-cert" not found Dec 01 15:16:59 crc kubenswrapper[4931]: I1201 15:16:59.547454 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:16:59 crc kubenswrapper[4931]: E1201 15:16:59.547655 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 15:16:59 crc kubenswrapper[4931]: E1201 15:16:59.547677 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:17:03.547670618 +0000 UTC m=+969.973544285 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "metrics-server-cert" not found Dec 01 15:17:02 crc kubenswrapper[4931]: I1201 15:17:02.609017 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:17:02 crc kubenswrapper[4931]: E1201 15:17:02.609267 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 15:17:02 crc kubenswrapper[4931]: E1201 15:17:02.609855 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert podName:371eef0f-aae7-40bd-9c47-1ffd0e77e08d nodeName:}" failed. No retries permitted until 2025-12-01 15:17:10.609830728 +0000 UTC m=+977.035704395 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert") pod "infra-operator-controller-manager-57548d458d-sk9bq" (UID: "371eef0f-aae7-40bd-9c47-1ffd0e77e08d") : secret "infra-operator-webhook-server-cert" not found Dec 01 15:17:03 crc kubenswrapper[4931]: I1201 15:17:03.116589 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:17:03 crc kubenswrapper[4931]: E1201 15:17:03.116888 4931 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:17:03 crc kubenswrapper[4931]: E1201 15:17:03.117023 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert podName:b366d143-b330-4141-be76-b87796b94301 nodeName:}" failed. No retries permitted until 2025-12-01 15:17:11.116989093 +0000 UTC m=+977.542862800 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" (UID: "b366d143-b330-4141-be76-b87796b94301") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 15:17:03 crc kubenswrapper[4931]: I1201 15:17:03.624117 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:17:03 crc kubenswrapper[4931]: I1201 15:17:03.624267 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:17:03 crc kubenswrapper[4931]: E1201 15:17:03.624334 4931 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 15:17:03 crc kubenswrapper[4931]: E1201 15:17:03.624479 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:17:11.624451855 +0000 UTC m=+978.050325532 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "webhook-server-cert" not found Dec 01 15:17:03 crc kubenswrapper[4931]: E1201 15:17:03.624483 4931 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 15:17:03 crc kubenswrapper[4931]: E1201 15:17:03.624559 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs podName:803870f9-7602-4eae-ba61-09e7aa4c63bb nodeName:}" failed. No retries permitted until 2025-12-01 15:17:11.624536267 +0000 UTC m=+978.050410104 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs") pod "openstack-operator-controller-manager-95d9848f7-bjcxf" (UID: "803870f9-7602-4eae-ba61-09e7aa4c63bb") : secret "metrics-server-cert" not found Dec 01 15:17:10 crc kubenswrapper[4931]: I1201 15:17:10.658290 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:17:10 crc kubenswrapper[4931]: E1201 15:17:10.658770 4931 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 15:17:10 crc kubenswrapper[4931]: E1201 15:17:10.659636 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert 
podName:371eef0f-aae7-40bd-9c47-1ffd0e77e08d nodeName:}" failed. No retries permitted until 2025-12-01 15:17:26.659612752 +0000 UTC m=+993.085486419 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert") pod "infra-operator-controller-manager-57548d458d-sk9bq" (UID: "371eef0f-aae7-40bd-9c47-1ffd0e77e08d") : secret "infra-operator-webhook-server-cert" not found Dec 01 15:17:10 crc kubenswrapper[4931]: E1201 15:17:10.727716 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 01 15:17:10 crc kubenswrapper[4931]: E1201 15:17:10.727970 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vzqms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-6x5bb_openstack-operators(6b36c7c7-1886-476e-b0d8-50168c04ff83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:17:11 crc kubenswrapper[4931]: I1201 15:17:11.171330 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: 
\"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:17:11 crc kubenswrapper[4931]: I1201 15:17:11.179845 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b366d143-b330-4141-be76-b87796b94301-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc\" (UID: \"b366d143-b330-4141-be76-b87796b94301\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:17:11 crc kubenswrapper[4931]: I1201 15:17:11.302516 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:17:11 crc kubenswrapper[4931]: E1201 15:17:11.345961 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 01 15:17:11 crc kubenswrapper[4931]: E1201 15:17:11.346188 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tj8zq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-gzpgd_openstack-operators(e20c64cb-88d5-4ffe-bb88-8715010ccf33): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:17:11 crc kubenswrapper[4931]: I1201 15:17:11.687057 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs\") pod 
\"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:17:11 crc kubenswrapper[4931]: I1201 15:17:11.687600 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:17:11 crc kubenswrapper[4931]: I1201 15:17:11.694615 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-metrics-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:17:11 crc kubenswrapper[4931]: I1201 15:17:11.694703 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/803870f9-7602-4eae-ba61-09e7aa4c63bb-webhook-certs\") pod \"openstack-operator-controller-manager-95d9848f7-bjcxf\" (UID: \"803870f9-7602-4eae-ba61-09e7aa4c63bb\") " pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:17:11 crc kubenswrapper[4931]: I1201 15:17:11.797228 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:17:12 crc kubenswrapper[4931]: E1201 15:17:12.017732 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 01 15:17:12 crc kubenswrapper[4931]: E1201 15:17:12.018104 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cmd7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-n4w4t_openstack-operators(a35a63a2-123b-45f4-99fe-4a7baece61be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:17:12 crc kubenswrapper[4931]: E1201 15:17:12.575095 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 01 15:17:12 crc kubenswrapper[4931]: E1201 15:17:12.575319 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtnhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-vv2tp_openstack-operators(12e80a83-7d9f-417b-934e-83d23085f11b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:17:13 crc kubenswrapper[4931]: E1201 15:17:13.189318 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 01 15:17:13 crc kubenswrapper[4931]: E1201 15:17:13.189816 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tqnws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-h5c6g_openstack-operators(3dc1d698-63f8-4e4b-8e28-6e128b5b46da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:17:14 crc kubenswrapper[4931]: I1201 15:17:14.256401 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc"] Dec 01 15:17:14 crc kubenswrapper[4931]: I1201 15:17:14.407946 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf"] Dec 01 15:17:14 crc kubenswrapper[4931]: W1201 15:17:14.661693 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb366d143_b330_4141_be76_b87796b94301.slice/crio-b3a0fe182c43fe8bb701a2b1e983da5af89526c6b047f983289cfeda6927a0ec WatchSource:0}: Error finding container b3a0fe182c43fe8bb701a2b1e983da5af89526c6b047f983289cfeda6927a0ec: Status 404 returned error can't find the container with id b3a0fe182c43fe8bb701a2b1e983da5af89526c6b047f983289cfeda6927a0ec Dec 01 15:17:14 crc 
kubenswrapper[4931]: I1201 15:17:14.936368 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" event={"ID":"b366d143-b330-4141-be76-b87796b94301","Type":"ContainerStarted","Data":"b3a0fe182c43fe8bb701a2b1e983da5af89526c6b047f983289cfeda6927a0ec"} Dec 01 15:17:16 crc kubenswrapper[4931]: W1201 15:17:16.888344 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod803870f9_7602_4eae_ba61_09e7aa4c63bb.slice/crio-68ba9d36484550a01accb613125c6505872f6062df332b83a779f55f9b61e084 WatchSource:0}: Error finding container 68ba9d36484550a01accb613125c6505872f6062df332b83a779f55f9b61e084: Status 404 returned error can't find the container with id 68ba9d36484550a01accb613125c6505872f6062df332b83a779f55f9b61e084 Dec 01 15:17:16 crc kubenswrapper[4931]: I1201 15:17:16.954164 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" event={"ID":"803870f9-7602-4eae-ba61-09e7aa4c63bb","Type":"ContainerStarted","Data":"68ba9d36484550a01accb613125c6505872f6062df332b83a779f55f9b61e084"} Dec 01 15:17:19 crc kubenswrapper[4931]: I1201 15:17:19.872132 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:17:19 crc kubenswrapper[4931]: I1201 15:17:19.872960 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:17:20 crc 
kubenswrapper[4931]: I1201 15:17:20.988420 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" event={"ID":"006dbf05-46c7-4348-a9bb-74a7c56fd3fd","Type":"ContainerStarted","Data":"227935d3155430288ae331f741f8ad333bd2606901af7122e7bf0d81f9f066c4"} Dec 01 15:17:20 crc kubenswrapper[4931]: I1201 15:17:20.990287 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" event={"ID":"fff7f112-af9f-42e3-beef-e0efdcf602c9","Type":"ContainerStarted","Data":"0db1c7ac9ad2aab915f153d7f40f7bc413c38d8458727003edf26e68a2b040fd"} Dec 01 15:17:20 crc kubenswrapper[4931]: I1201 15:17:20.991941 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" event={"ID":"e45f587b-3120-4b16-9c9a-66bc5c1252aa","Type":"ContainerStarted","Data":"51d69728af8f77ad93237c6a1539ae56315da5dfacdada3328904613530171bd"} Dec 01 15:17:20 crc kubenswrapper[4931]: I1201 15:17:20.993336 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" event={"ID":"803870f9-7602-4eae-ba61-09e7aa4c63bb","Type":"ContainerStarted","Data":"daf2a7eba2e5e7ae66fdf1819510daefda2486e39d8c9783f79c72d4cf0c6522"} Dec 01 15:17:20 crc kubenswrapper[4931]: I1201 15:17:20.993550 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:17:21 crc kubenswrapper[4931]: I1201 15:17:21.039641 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" podStartSLOduration=26.039603904 podStartE2EDuration="26.039603904s" podCreationTimestamp="2025-12-01 15:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:17:21.030880585 +0000 UTC m=+987.456754252" watchObservedRunningTime="2025-12-01 15:17:21.039603904 +0000 UTC m=+987.465477571" Dec 01 15:17:26 crc kubenswrapper[4931]: I1201 15:17:26.760785 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:17:26 crc kubenswrapper[4931]: I1201 15:17:26.768343 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/371eef0f-aae7-40bd-9c47-1ffd0e77e08d-cert\") pod \"infra-operator-controller-manager-57548d458d-sk9bq\" (UID: \"371eef0f-aae7-40bd-9c47-1ffd0e77e08d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:17:26 crc kubenswrapper[4931]: I1201 15:17:26.875483 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:17:30 crc kubenswrapper[4931]: I1201 15:17:30.073292 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" event={"ID":"31166fda-e2fe-4a4a-9717-550172ed4093","Type":"ContainerStarted","Data":"5031c0738e0aec7f073737a2d7b75713142af49da60195c980f607335bc5a5d3"} Dec 01 15:17:30 crc kubenswrapper[4931]: I1201 15:17:30.076150 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" event={"ID":"921c2e7b-0f37-4e93-ab2e-76a23e146d28","Type":"ContainerStarted","Data":"fa45f4ce4305af66947aa0546000c3f32e56dbf44e3a97e4e942b8ba16181610"} Dec 01 15:17:31 crc kubenswrapper[4931]: I1201 15:17:31.087824 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" event={"ID":"1c0a3def-dea0-40f3-8368-a36f6030f7f7","Type":"ContainerStarted","Data":"40449d08346893cbabb8b8448ea775be78e8f6597c14a5b6d41c66a907aac53f"} Dec 01 15:17:31 crc kubenswrapper[4931]: I1201 15:17:31.089652 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" event={"ID":"8258a972-ead1-4bee-ae4f-cba90b238dde","Type":"ContainerStarted","Data":"074714ca66640e77951fad1c254747d0b4124a39a1163a0414db2cb71a33c148"} Dec 01 15:17:31 crc kubenswrapper[4931]: I1201 15:17:31.092008 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" event={"ID":"71d6d824-1cf3-4984-b8de-f10d19192a5f","Type":"ContainerStarted","Data":"6e71709d39eb66ec522997b0e6576078ab3d03cf12495335fcc0cc3d20fd6044"} Dec 01 15:17:31 crc kubenswrapper[4931]: I1201 15:17:31.094074 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" event={"ID":"b30439fb-2d71-4c3b-97ec-5e304c1eb15e","Type":"ContainerStarted","Data":"3f4fed386c17fc03627279eda76e57cb5486d33a6827795bcb6536149bacd30a"} Dec 01 15:17:31 crc kubenswrapper[4931]: I1201 15:17:31.804649 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-95d9848f7-bjcxf" Dec 01 15:17:33 crc kubenswrapper[4931]: I1201 15:17:33.109431 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" event={"ID":"39f1253b-afe7-47b1-8c68-2a36d49f969b","Type":"ContainerStarted","Data":"510853c9f8fed6a2a2d0a71ce71b5f0195514ee8c43e670f90e26d59fd038cb2"} Dec 01 15:17:33 crc kubenswrapper[4931]: I1201 15:17:33.123687 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" event={"ID":"e32d4db7-aa36-4724-a113-2f7ff2af254d","Type":"ContainerStarted","Data":"5796615a813b95a84f493a44bf2219b73f0077dab4167628650bb8abe2398d19"} Dec 01 15:17:33 crc kubenswrapper[4931]: I1201 15:17:33.609935 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq"] Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.039670 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.040463 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vzqms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-6x5bb_openstack-operators(6b36c7c7-1886-476e-b0d8-50168c04ff83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.041708 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" podUID="6b36c7c7-1886-476e-b0d8-50168c04ff83" Dec 01 15:17:34 crc 
kubenswrapper[4931]: E1201 15:17:34.052345 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.052609 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cmd7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-n4w4t_openstack-operators(a35a63a2-123b-45f4-99fe-4a7baece61be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.053828 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" podUID="a35a63a2-123b-45f4-99fe-4a7baece61be" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.072842 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.073080 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtnhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-vv2tp_openstack-operators(12e80a83-7d9f-417b-934e-83d23085f11b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.074622 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" podUID="12e80a83-7d9f-417b-934e-83d23085f11b" Dec 01 15:17:34 crc kubenswrapper[4931]: I1201 15:17:34.138964 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" event={"ID":"371eef0f-aae7-40bd-9c47-1ffd0e77e08d","Type":"ContainerStarted","Data":"bc9d82904144b9d351dfcebd8178baa8b0701473908b03868e3b262aef9cb507"} Dec 01 15:17:34 
crc kubenswrapper[4931]: I1201 15:17:34.142553 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" event={"ID":"75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae","Type":"ContainerStarted","Data":"f464d2f299aa5d4c4d32da8a27b88d99f3a0991ee30f9984f27b953155ae8296"} Dec 01 15:17:34 crc kubenswrapper[4931]: I1201 15:17:34.145456 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" event={"ID":"18b9fb30-a34f-42ad-9692-84c532f586d6","Type":"ContainerStarted","Data":"d9f102d25ecc78dddebd860f4ef23b6364636c0552686c6b509c76dd7538035a"} Dec 01 15:17:34 crc kubenswrapper[4931]: I1201 15:17:34.146772 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" event={"ID":"56ca76b6-8e16-4d73-9e78-f20e046738fc","Type":"ContainerStarted","Data":"609a33a9fe6cc0e873c0ff0b949edda635d0f6517d2152a082cfb9a7a424e534"} Dec 01 15:17:34 crc kubenswrapper[4931]: I1201 15:17:34.149505 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" event={"ID":"cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3","Type":"ContainerStarted","Data":"e8d85b841671078fa3c5ebf51e75b05fc8c8e897cd5c624cb8c57e878705770b"} Dec 01 15:17:34 crc kubenswrapper[4931]: I1201 15:17:34.235027 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-zbd9m" podStartSLOduration=17.301801464 podStartE2EDuration="39.234995318s" podCreationTimestamp="2025-12-01 15:16:55 +0000 UTC" firstStartedPulling="2025-12-01 15:16:57.193613759 +0000 UTC m=+963.619487426" lastFinishedPulling="2025-12-01 15:17:19.126807603 +0000 UTC m=+985.552681280" observedRunningTime="2025-12-01 15:17:34.227928936 +0000 UTC m=+1000.653802603" watchObservedRunningTime="2025-12-01 
15:17:34.234995318 +0000 UTC m=+1000.660868985" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.247246 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.247463 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tj8zq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-gzpgd_openstack-operators(e20c64cb-88d5-4ffe-bb88-8715010ccf33): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:17:34 crc kubenswrapper[4931]: E1201 15:17:34.249205 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" podUID="e20c64cb-88d5-4ffe-bb88-8715010ccf33" Dec 01 15:17:35 crc kubenswrapper[4931]: E1201 15:17:35.128249 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" podUID="3dc1d698-63f8-4e4b-8e28-6e128b5b46da" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.173315 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" event={"ID":"31166fda-e2fe-4a4a-9717-550172ed4093","Type":"ContainerStarted","Data":"79415e402dc47e9265cd6b1ac44a590c6afc8b1e1f3c30a9289369397a595913"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.174157 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.177600 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.179597 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" 
event={"ID":"006dbf05-46c7-4348-a9bb-74a7c56fd3fd","Type":"ContainerStarted","Data":"1b9f3a62eeaaada0fa29420fbc3e9989803f8c82177160ea16965d77950f86ba"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.180514 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.185523 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.201221 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" event={"ID":"fff7f112-af9f-42e3-beef-e0efdcf602c9","Type":"ContainerStarted","Data":"96dc5dfd64fcb0eec2650e24f295231b7b712592e2479aaaadec0aa56f5e0d2f"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.201950 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.203105 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-g9s8h" podStartSLOduration=3.767860004 podStartE2EDuration="41.203086667s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.947305106 +0000 UTC m=+963.373178773" lastFinishedPulling="2025-12-01 15:17:34.382531769 +0000 UTC m=+1000.808405436" observedRunningTime="2025-12-01 15:17:35.201231074 +0000 UTC m=+1001.627104741" watchObservedRunningTime="2025-12-01 15:17:35.203086667 +0000 UTC m=+1001.628960334" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.206276 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" 
event={"ID":"e45f587b-3120-4b16-9c9a-66bc5c1252aa","Type":"ContainerStarted","Data":"276e09c91ee971ac6a8861ff06070e469240732ac0d4c01fdfc8b05b376c0bbe"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.206654 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.207466 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.208514 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" event={"ID":"3dc1d698-63f8-4e4b-8e28-6e128b5b46da","Type":"ContainerStarted","Data":"0953ca60e2112744e2a122218fd113341a0e2e734194734c6c6684bfb9be665c"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.210190 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.223695 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" event={"ID":"b366d143-b330-4141-be76-b87796b94301","Type":"ContainerStarted","Data":"22f99737fe2fc83412e50d3b696d2ba187bba47cb4fe4381b0ff51ad2fbe18d6"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.223768 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" event={"ID":"b366d143-b330-4141-be76-b87796b94301","Type":"ContainerStarted","Data":"c08e22f30f751ee38977a6d692759417d887cc1c88c95e561d4804f915629a35"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.224643 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.261501 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" event={"ID":"e32d4db7-aa36-4724-a113-2f7ff2af254d","Type":"ContainerStarted","Data":"6328caccff41f57d8cc045fe3bef2f0b5e1c251cda57934eeb997d96eb08df3f"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.261595 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.286035 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-g8vfm" podStartSLOduration=3.608831276 podStartE2EDuration="41.286015364s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.595526596 +0000 UTC m=+963.021400273" lastFinishedPulling="2025-12-01 15:17:34.272710694 +0000 UTC m=+1000.698584361" observedRunningTime="2025-12-01 15:17:35.28484323 +0000 UTC m=+1001.710716907" watchObservedRunningTime="2025-12-01 15:17:35.286015364 +0000 UTC m=+1001.711889031" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.295451 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" event={"ID":"75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae","Type":"ContainerStarted","Data":"4f4c97b96c54102264baabc91fd5c4e41fb01ac5155844d0523d0f669b21095a"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.295596 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.314121 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" event={"ID":"8258a972-ead1-4bee-ae4f-cba90b238dde","Type":"ContainerStarted","Data":"1e0f2b9072f5b32525af529b825baa6597cea1c23631537d656719b5409e3702"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.314580 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.332545 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.345852 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" event={"ID":"921c2e7b-0f37-4e93-ab2e-76a23e146d28","Type":"ContainerStarted","Data":"5cb77d7c19284d4699c5e0f460c403e2fb2143bf566f226d76af3839c505419c"} Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.346258 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.358202 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" podStartSLOduration=35.644758413 podStartE2EDuration="41.358184024s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:17:14.663362767 +0000 UTC m=+981.089236434" lastFinishedPulling="2025-12-01 15:17:20.376788378 +0000 UTC m=+986.802662045" observedRunningTime="2025-12-01 15:17:35.34858943 +0000 UTC m=+1001.774463097" watchObservedRunningTime="2025-12-01 15:17:35.358184024 +0000 UTC m=+1001.784057691" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.370874 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.411585 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mtpcz" podStartSLOduration=4.025013772 podStartE2EDuration="41.411559097s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.941532249 +0000 UTC m=+963.367405916" lastFinishedPulling="2025-12-01 15:17:34.328077584 +0000 UTC m=+1000.753951241" observedRunningTime="2025-12-01 15:17:35.410744144 +0000 UTC m=+1001.836617811" watchObservedRunningTime="2025-12-01 15:17:35.411559097 +0000 UTC m=+1001.837432764" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.474441 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" podStartSLOduration=4.134944281 podStartE2EDuration="41.474413701s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.986937617 +0000 UTC m=+963.412811284" lastFinishedPulling="2025-12-01 15:17:34.326407037 +0000 UTC m=+1000.752280704" observedRunningTime="2025-12-01 15:17:35.447939115 +0000 UTC m=+1001.873812782" watchObservedRunningTime="2025-12-01 15:17:35.474413701 +0000 UTC m=+1001.900287368" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.474929 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-lr4mz" podStartSLOduration=3.420027265 podStartE2EDuration="41.474924685s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.196691781 +0000 UTC m=+962.622565448" lastFinishedPulling="2025-12-01 15:17:34.251589201 +0000 UTC m=+1000.677462868" observedRunningTime="2025-12-01 15:17:35.47403529 +0000 UTC m=+1001.899908967" 
watchObservedRunningTime="2025-12-01 15:17:35.474924685 +0000 UTC m=+1001.900798352" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.541863 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-g2jqf" podStartSLOduration=3.847660937 podStartE2EDuration="41.541838275s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.577061454 +0000 UTC m=+963.002935121" lastFinishedPulling="2025-12-01 15:17:34.271238792 +0000 UTC m=+1000.697112459" observedRunningTime="2025-12-01 15:17:35.525016375 +0000 UTC m=+1001.950890042" watchObservedRunningTime="2025-12-01 15:17:35.541838275 +0000 UTC m=+1001.967711932" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.570545 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" podStartSLOduration=4.393466964 podStartE2EDuration="41.570522594s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:57.197260484 +0000 UTC m=+963.623134151" lastFinishedPulling="2025-12-01 15:17:34.374316124 +0000 UTC m=+1000.800189781" observedRunningTime="2025-12-01 15:17:35.566482138 +0000 UTC m=+1001.992355805" watchObservedRunningTime="2025-12-01 15:17:35.570522594 +0000 UTC m=+1001.996396261" Dec 01 15:17:35 crc kubenswrapper[4931]: I1201 15:17:35.597097 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-2fp96" podStartSLOduration=4.271785606 podStartE2EDuration="41.597067031s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.947712117 +0000 UTC m=+963.373585784" lastFinishedPulling="2025-12-01 15:17:34.272993542 +0000 UTC m=+1000.698867209" observedRunningTime="2025-12-01 15:17:35.593059427 +0000 UTC m=+1002.018933104" 
watchObservedRunningTime="2025-12-01 15:17:35.597067031 +0000 UTC m=+1002.022940698" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.364090 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" event={"ID":"3dc1d698-63f8-4e4b-8e28-6e128b5b46da","Type":"ContainerStarted","Data":"52a1352a7d6bfeb4235dc1aa676f99aed0c998ed028de50bb76e60af147c7d46"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.364504 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.367284 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" event={"ID":"1c0a3def-dea0-40f3-8368-a36f6030f7f7","Type":"ContainerStarted","Data":"094d5f50457086422f2330a9c25299dd61d43a0e44dd25122d9fec2eb25be3f6"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.367442 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.370045 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" event={"ID":"18b9fb30-a34f-42ad-9692-84c532f586d6","Type":"ContainerStarted","Data":"0eb9b40c20c237fc618fcb06287b6c07caf15b3fe53919706f94410e8eea6630"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.370122 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.371083 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" Dec 01 15:17:36 crc 
kubenswrapper[4931]: I1201 15:17:36.384853 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" event={"ID":"a35a63a2-123b-45f4-99fe-4a7baece61be","Type":"ContainerStarted","Data":"1716d98c7e65b3f96fa39fff0139e2d35e400df7758d1ef6a2cfdbea3a373447"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.384912 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" event={"ID":"a35a63a2-123b-45f4-99fe-4a7baece61be","Type":"ContainerStarted","Data":"88d6b8888a6656012fbdfa7e16e2e8a70972333f0c00942b2ca0d6824fda1b1c"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.385393 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.391715 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" podStartSLOduration=3.613766615 podStartE2EDuration="42.391693388s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:57.136880905 +0000 UTC m=+963.562754572" lastFinishedPulling="2025-12-01 15:17:35.914807668 +0000 UTC m=+1002.340681345" observedRunningTime="2025-12-01 15:17:36.38440277 +0000 UTC m=+1002.810276427" watchObservedRunningTime="2025-12-01 15:17:36.391693388 +0000 UTC m=+1002.817567055" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.399476 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" event={"ID":"e20c64cb-88d5-4ffe-bb88-8715010ccf33","Type":"ContainerStarted","Data":"53b55e105273bdd3eea462d7c8b4140f5c51bb286b88a40c67fe23ca90586b5d"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.399680 4931 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" event={"ID":"e20c64cb-88d5-4ffe-bb88-8715010ccf33","Type":"ContainerStarted","Data":"6a0f76f4bc71de42534299a414cb057274bc7b44fdde7edd9905896e137550a6"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.399899 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.402231 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" event={"ID":"39f1253b-afe7-47b1-8c68-2a36d49f969b","Type":"ContainerStarted","Data":"4cdb46aaa3abc43f087f9c5cee4c63b292954d6e231160ac1c706ad623578a0c"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.403530 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.405511 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" event={"ID":"12e80a83-7d9f-417b-934e-83d23085f11b","Type":"ContainerStarted","Data":"fadfe72a26d3918c5fdd530c53aa4d2ab8db4999e376584e22cbc3c9c1b04590"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.405605 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" event={"ID":"12e80a83-7d9f-417b-934e-83d23085f11b","Type":"ContainerStarted","Data":"85f9f93e0877295f194197300d1aecbfed75170fc585afc939ca77e233b0a09b"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.405882 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.412924 4931 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" podStartSLOduration=3.7610326 podStartE2EDuration="41.412912634s" podCreationTimestamp="2025-12-01 15:16:55 +0000 UTC" firstStartedPulling="2025-12-01 15:16:57.187158813 +0000 UTC m=+963.613032480" lastFinishedPulling="2025-12-01 15:17:34.839038847 +0000 UTC m=+1001.264912514" observedRunningTime="2025-12-01 15:17:36.411441832 +0000 UTC m=+1002.837315509" watchObservedRunningTime="2025-12-01 15:17:36.412912634 +0000 UTC m=+1002.838786301" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.415638 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" event={"ID":"b30439fb-2d71-4c3b-97ec-5e304c1eb15e","Type":"ContainerStarted","Data":"5d08d16545dff0ff9dcd323ff999d56d66b06c768061e20cd294128ec6ba3881"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.417044 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.422790 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.436733 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" event={"ID":"56ca76b6-8e16-4d73-9e78-f20e046738fc","Type":"ContainerStarted","Data":"019eb3edc3e3085aa0a12829d3d097a2bd89e50dd8826b29040e4ed0e1fd6b0d"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.437592 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.440201 4931 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-bpqmb" podStartSLOduration=5.065174832 podStartE2EDuration="42.440185702s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.962513444 +0000 UTC m=+963.388387111" lastFinishedPulling="2025-12-01 15:17:34.337524314 +0000 UTC m=+1000.763397981" observedRunningTime="2025-12-01 15:17:36.438623848 +0000 UTC m=+1002.864497525" watchObservedRunningTime="2025-12-01 15:17:36.440185702 +0000 UTC m=+1002.866059389" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.441941 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" event={"ID":"71d6d824-1cf3-4984-b8de-f10d19192a5f","Type":"ContainerStarted","Data":"08f9431eb73351ae997d5c064f5cfd1381473f0a7fdf01c7bd823ab3daa1db0d"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.443304 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.458270 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.469944 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" event={"ID":"6b36c7c7-1886-476e-b0d8-50168c04ff83","Type":"ContainerStarted","Data":"526575ab39be2065a35727135b8fae6e1947114f889924196439dc46f48cd12d"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.469989 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" 
event={"ID":"6b36c7c7-1886-476e-b0d8-50168c04ff83","Type":"ContainerStarted","Data":"e558dd81d3d74e82104cc299900140dabac0f69b6ed787e863f2f06bc8b75f60"} Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.470318 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.504269 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" podStartSLOduration=4.358026356 podStartE2EDuration="42.50423967s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.986852845 +0000 UTC m=+963.412726512" lastFinishedPulling="2025-12-01 15:17:35.133066159 +0000 UTC m=+1001.558939826" observedRunningTime="2025-12-01 15:17:36.496251693 +0000 UTC m=+1002.922125360" watchObservedRunningTime="2025-12-01 15:17:36.50423967 +0000 UTC m=+1002.930113338" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.533178 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ms6t7" podStartSLOduration=5.124082355 podStartE2EDuration="42.533156466s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.929538434 +0000 UTC m=+963.355412101" lastFinishedPulling="2025-12-01 15:17:34.338612545 +0000 UTC m=+1000.764486212" observedRunningTime="2025-12-01 15:17:36.53295769 +0000 UTC m=+1002.958831357" watchObservedRunningTime="2025-12-01 15:17:36.533156466 +0000 UTC m=+1002.959030133" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.581475 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" podStartSLOduration=3.268617044 podStartE2EDuration="42.581453914s" podCreationTimestamp="2025-12-01 
15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.514103961 +0000 UTC m=+962.939977628" lastFinishedPulling="2025-12-01 15:17:35.826940831 +0000 UTC m=+1002.252814498" observedRunningTime="2025-12-01 15:17:36.577327846 +0000 UTC m=+1003.003201533" watchObservedRunningTime="2025-12-01 15:17:36.581453914 +0000 UTC m=+1003.007327581" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.602656 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" podStartSLOduration=5.447151244 podStartE2EDuration="42.602636499s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:57.197284934 +0000 UTC m=+963.623158601" lastFinishedPulling="2025-12-01 15:17:34.352770189 +0000 UTC m=+1000.778643856" observedRunningTime="2025-12-01 15:17:36.600326013 +0000 UTC m=+1003.026199680" watchObservedRunningTime="2025-12-01 15:17:36.602636499 +0000 UTC m=+1003.028510166" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.626220 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" podStartSLOduration=5.01843996 podStartE2EDuration="42.626201571s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:57.010938848 +0000 UTC m=+963.436812515" lastFinishedPulling="2025-12-01 15:17:34.618700459 +0000 UTC m=+1001.044574126" observedRunningTime="2025-12-01 15:17:36.622762003 +0000 UTC m=+1003.048635670" watchObservedRunningTime="2025-12-01 15:17:36.626201571 +0000 UTC m=+1003.052075228" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.674700 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" podStartSLOduration=4.747275263 podStartE2EDuration="42.674678465s" podCreationTimestamp="2025-12-01 15:16:54 +0000 
UTC" firstStartedPulling="2025-12-01 15:16:56.975126037 +0000 UTC m=+963.400999704" lastFinishedPulling="2025-12-01 15:17:34.902529239 +0000 UTC m=+1001.328402906" observedRunningTime="2025-12-01 15:17:36.664858804 +0000 UTC m=+1003.090732461" watchObservedRunningTime="2025-12-01 15:17:36.674678465 +0000 UTC m=+1003.100552132" Dec 01 15:17:36 crc kubenswrapper[4931]: I1201 15:17:36.687199 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-wcntp" podStartSLOduration=4.851080512 podStartE2EDuration="42.687166101s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.57414041 +0000 UTC m=+963.000014077" lastFinishedPulling="2025-12-01 15:17:34.410225999 +0000 UTC m=+1000.836099666" observedRunningTime="2025-12-01 15:17:36.684645429 +0000 UTC m=+1003.110519106" watchObservedRunningTime="2025-12-01 15:17:36.687166101 +0000 UTC m=+1003.113039768" Dec 01 15:17:37 crc kubenswrapper[4931]: I1201 15:17:37.485445 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-rv87n" Dec 01 15:17:37 crc kubenswrapper[4931]: I1201 15:17:37.519448 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" podStartSLOduration=5.160559367 podStartE2EDuration="43.519352712s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:16:56.543562769 +0000 UTC m=+962.969436426" lastFinishedPulling="2025-12-01 15:17:34.902356104 +0000 UTC m=+1001.328229771" observedRunningTime="2025-12-01 15:17:36.72882967 +0000 UTC m=+1003.154703357" watchObservedRunningTime="2025-12-01 15:17:37.519352712 +0000 UTC m=+1003.945226379" Dec 01 15:17:38 crc kubenswrapper[4931]: I1201 15:17:38.486646 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" event={"ID":"371eef0f-aae7-40bd-9c47-1ffd0e77e08d","Type":"ContainerStarted","Data":"ce947389d25f7bafe18043ace4e8450d2145da1f1eee1c72a15eaa68fa4a77c9"} Dec 01 15:17:38 crc kubenswrapper[4931]: I1201 15:17:38.489078 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-l24dv" Dec 01 15:17:39 crc kubenswrapper[4931]: I1201 15:17:39.498251 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" event={"ID":"371eef0f-aae7-40bd-9c47-1ffd0e77e08d","Type":"ContainerStarted","Data":"240377bb9e062d7fc65b6f789b0788cf6d9b5974836a23c0743fa53c59bbd976"} Dec 01 15:17:39 crc kubenswrapper[4931]: I1201 15:17:39.498643 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:17:39 crc kubenswrapper[4931]: I1201 15:17:39.549566 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" podStartSLOduration=41.440843302 podStartE2EDuration="45.549523492s" podCreationTimestamp="2025-12-01 15:16:54 +0000 UTC" firstStartedPulling="2025-12-01 15:17:34.049861124 +0000 UTC m=+1000.475734791" lastFinishedPulling="2025-12-01 15:17:38.158541284 +0000 UTC m=+1004.584414981" observedRunningTime="2025-12-01 15:17:39.544336554 +0000 UTC m=+1005.970210261" watchObservedRunningTime="2025-12-01 15:17:39.549523492 +0000 UTC m=+1005.975397159" Dec 01 15:17:41 crc kubenswrapper[4931]: I1201 15:17:41.312884 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc" Dec 01 15:17:45 crc kubenswrapper[4931]: I1201 15:17:45.021022 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6x5bb" Dec 01 15:17:45 crc kubenswrapper[4931]: I1201 15:17:45.121941 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gzpgd" Dec 01 15:17:45 crc kubenswrapper[4931]: I1201 15:17:45.537570 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-jvkpz" Dec 01 15:17:45 crc kubenswrapper[4931]: I1201 15:17:45.645365 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-h5c6g" Dec 01 15:17:45 crc kubenswrapper[4931]: I1201 15:17:45.868794 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-n4w4t" Dec 01 15:17:45 crc kubenswrapper[4931]: I1201 15:17:45.928645 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vv2tp" Dec 01 15:17:46 crc kubenswrapper[4931]: I1201 15:17:46.156229 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-lsxmf" Dec 01 15:17:46 crc kubenswrapper[4931]: I1201 15:17:46.189023 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6kbkb" Dec 01 15:17:46 crc kubenswrapper[4931]: I1201 15:17:46.890128 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-sk9bq" Dec 01 15:17:49 crc kubenswrapper[4931]: I1201 15:17:49.872329 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:17:49 crc kubenswrapper[4931]: I1201 15:17:49.872927 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.293627 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zl7j9"] Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.296230 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.302099 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.302219 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-r4xt2" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.304500 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.309339 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zl7j9"] Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.312722 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.345527 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9htv\" (UniqueName: 
\"kubernetes.io/projected/91033537-0105-4072-bc00-90be0d75b961-kube-api-access-m9htv\") pod \"dnsmasq-dns-675f4bcbfc-zl7j9\" (UID: \"91033537-0105-4072-bc00-90be0d75b961\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.345615 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91033537-0105-4072-bc00-90be0d75b961-config\") pod \"dnsmasq-dns-675f4bcbfc-zl7j9\" (UID: \"91033537-0105-4072-bc00-90be0d75b961\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.378036 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7lrkc"] Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.380394 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.389016 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.416516 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7lrkc"] Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.448469 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91033537-0105-4072-bc00-90be0d75b961-config\") pod \"dnsmasq-dns-675f4bcbfc-zl7j9\" (UID: \"91033537-0105-4072-bc00-90be0d75b961\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.448538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdwgp\" (UniqueName: \"kubernetes.io/projected/c27f51a4-79b4-482c-a9a7-f125c538edfc-kube-api-access-kdwgp\") pod \"dnsmasq-dns-78dd6ddcc-7lrkc\" (UID: 
\"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.448574 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-config\") pod \"dnsmasq-dns-78dd6ddcc-7lrkc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.448650 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7lrkc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.448695 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9htv\" (UniqueName: \"kubernetes.io/projected/91033537-0105-4072-bc00-90be0d75b961-kube-api-access-m9htv\") pod \"dnsmasq-dns-675f4bcbfc-zl7j9\" (UID: \"91033537-0105-4072-bc00-90be0d75b961\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.456877 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91033537-0105-4072-bc00-90be0d75b961-config\") pod \"dnsmasq-dns-675f4bcbfc-zl7j9\" (UID: \"91033537-0105-4072-bc00-90be0d75b961\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.484152 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9htv\" (UniqueName: \"kubernetes.io/projected/91033537-0105-4072-bc00-90be0d75b961-kube-api-access-m9htv\") pod \"dnsmasq-dns-675f4bcbfc-zl7j9\" (UID: \"91033537-0105-4072-bc00-90be0d75b961\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.549957 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7lrkc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.550144 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdwgp\" (UniqueName: \"kubernetes.io/projected/c27f51a4-79b4-482c-a9a7-f125c538edfc-kube-api-access-kdwgp\") pod \"dnsmasq-dns-78dd6ddcc-7lrkc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.550173 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-config\") pod \"dnsmasq-dns-78dd6ddcc-7lrkc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.550835 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7lrkc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.551145 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-config\") pod \"dnsmasq-dns-78dd6ddcc-7lrkc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.578676 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdwgp\" (UniqueName: \"kubernetes.io/projected/c27f51a4-79b4-482c-a9a7-f125c538edfc-kube-api-access-kdwgp\") pod \"dnsmasq-dns-78dd6ddcc-7lrkc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.629754 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:06 crc kubenswrapper[4931]: I1201 15:18:06.722752 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:07 crc kubenswrapper[4931]: I1201 15:18:07.238938 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zl7j9"] Dec 01 15:18:07 crc kubenswrapper[4931]: I1201 15:18:07.244473 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 15:18:07 crc kubenswrapper[4931]: I1201 15:18:07.294623 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7lrkc"] Dec 01 15:18:07 crc kubenswrapper[4931]: I1201 15:18:07.822266 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" event={"ID":"c27f51a4-79b4-482c-a9a7-f125c538edfc","Type":"ContainerStarted","Data":"e586833bbfb376522ee444d4032ce1f41cd18aa915b5088d799f880995d49350"} Dec 01 15:18:07 crc kubenswrapper[4931]: I1201 15:18:07.824134 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" event={"ID":"91033537-0105-4072-bc00-90be0d75b961","Type":"ContainerStarted","Data":"ec0396d8eae90bc58b8b0d40abce78e9e56e5c96e871c8a6d4a4dadfcc3224b5"} Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.337803 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zl7j9"] Dec 01 15:18:09 crc 
kubenswrapper[4931]: I1201 15:18:09.357689 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-f58nx"] Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.358940 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.372330 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-f58nx"] Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.408328 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkch7\" (UniqueName: \"kubernetes.io/projected/87514fe1-9d83-41dc-90fb-aae028a6bf34-kube-api-access-nkch7\") pod \"dnsmasq-dns-666b6646f7-f58nx\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.408441 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-config\") pod \"dnsmasq-dns-666b6646f7-f58nx\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.408472 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-dns-svc\") pod \"dnsmasq-dns-666b6646f7-f58nx\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.512620 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkch7\" (UniqueName: \"kubernetes.io/projected/87514fe1-9d83-41dc-90fb-aae028a6bf34-kube-api-access-nkch7\") pod 
\"dnsmasq-dns-666b6646f7-f58nx\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.512727 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-config\") pod \"dnsmasq-dns-666b6646f7-f58nx\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.512769 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-dns-svc\") pod \"dnsmasq-dns-666b6646f7-f58nx\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.513941 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-dns-svc\") pod \"dnsmasq-dns-666b6646f7-f58nx\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.515335 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-config\") pod \"dnsmasq-dns-666b6646f7-f58nx\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.563463 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkch7\" (UniqueName: \"kubernetes.io/projected/87514fe1-9d83-41dc-90fb-aae028a6bf34-kube-api-access-nkch7\") pod \"dnsmasq-dns-666b6646f7-f58nx\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " 
pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.638504 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7lrkc"] Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.664667 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sfgtv"] Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.667580 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.688829 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.715797 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sfgtv"] Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.719187 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwmn\" (UniqueName: \"kubernetes.io/projected/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-kube-api-access-8wwmn\") pod \"dnsmasq-dns-57d769cc4f-sfgtv\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.719537 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-config\") pod \"dnsmasq-dns-57d769cc4f-sfgtv\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.719612 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-dns-svc\") pod 
\"dnsmasq-dns-57d769cc4f-sfgtv\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.822301 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-sfgtv\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.822419 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwmn\" (UniqueName: \"kubernetes.io/projected/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-kube-api-access-8wwmn\") pod \"dnsmasq-dns-57d769cc4f-sfgtv\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.822475 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-config\") pod \"dnsmasq-dns-57d769cc4f-sfgtv\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.823611 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-sfgtv\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.824276 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-config\") pod \"dnsmasq-dns-57d769cc4f-sfgtv\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.865448 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwmn\" (UniqueName: \"kubernetes.io/projected/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-kube-api-access-8wwmn\") pod \"dnsmasq-dns-57d769cc4f-sfgtv\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:09 crc kubenswrapper[4931]: I1201 15:18:09.994023 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.349619 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-f58nx"] Dec 01 15:18:10 crc kubenswrapper[4931]: W1201 15:18:10.374026 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87514fe1_9d83_41dc_90fb_aae028a6bf34.slice/crio-1d41ecf99b02b1044f7551f2e8d246eddba16590835b1bedf402866236bfea92 WatchSource:0}: Error finding container 1d41ecf99b02b1044f7551f2e8d246eddba16590835b1bedf402866236bfea92: Status 404 returned error can't find the container with id 1d41ecf99b02b1044f7551f2e8d246eddba16590835b1bedf402866236bfea92 Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.484373 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.485806 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.491157 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.491268 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.491155 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.491377 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-j6jsj" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.491866 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.504324 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.504794 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.504852 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535023 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f8b18d2-d611-4ad6-850a-4ad19544c016-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535075 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535121 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-config-data\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535148 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535184 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv89w\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-kube-api-access-xv89w\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535211 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535241 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535257 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535302 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f8b18d2-d611-4ad6-850a-4ad19544c016-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535320 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.535349 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.635193 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sfgtv"] Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637333 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637420 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637444 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637493 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f8b18d2-d611-4ad6-850a-4ad19544c016-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637514 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637545 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637579 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f8b18d2-d611-4ad6-850a-4ad19544c016-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637604 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637631 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-config-data\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637664 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.637698 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv89w\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-kube-api-access-xv89w\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" 
Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.638005 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.638261 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.639329 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.639843 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-config-data\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.641093 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.643315 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: W1201 15:18:10.646412 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b75a3c4_3990_4ea8_afb0_bb60d15a8b50.slice/crio-305b43b75fb1212ffe77dec5f61c484a2f013b8658d9e26b06572e6fba7c128f WatchSource:0}: Error finding container 305b43b75fb1212ffe77dec5f61c484a2f013b8658d9e26b06572e6fba7c128f: Status 404 returned error can't find the container with id 305b43b75fb1212ffe77dec5f61c484a2f013b8658d9e26b06572e6fba7c128f Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.646944 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.647218 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f8b18d2-d611-4ad6-850a-4ad19544c016-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.647357 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f8b18d2-d611-4ad6-850a-4ad19544c016-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.651986 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.655933 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv89w\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-kube-api-access-xv89w\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.672766 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.813727 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.815835 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.819139 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.820865 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.823188 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.823194 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.823449 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.823558 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ztd2n" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.823569 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.824444 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.833102 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.879007 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-f58nx" event={"ID":"87514fe1-9d83-41dc-90fb-aae028a6bf34","Type":"ContainerStarted","Data":"1d41ecf99b02b1044f7551f2e8d246eddba16590835b1bedf402866236bfea92"} Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.881883 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" event={"ID":"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50","Type":"ContainerStarted","Data":"305b43b75fb1212ffe77dec5f61c484a2f013b8658d9e26b06572e6fba7c128f"} Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.945692 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946037 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946068 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946105 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946138 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946157 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946183 
4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946203 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946224 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrdkt\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-kube-api-access-jrdkt\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946250 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:10 crc kubenswrapper[4931]: I1201 15:18:10.946341 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047651 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047695 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047726 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047745 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047763 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrdkt\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-kube-api-access-jrdkt\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047782 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047856 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047879 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047900 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047918 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.047969 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.048951 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.049981 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.051275 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.051671 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.052005 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") device mount path \"/mnt/openstack/pv03\"" 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.053216 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.053715 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.054360 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.060635 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.061363 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.076865 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jrdkt\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-kube-api-access-jrdkt\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.102746 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.153875 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.419073 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 01 15:18:11 crc kubenswrapper[4931]: W1201 15:18:11.444148 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f8b18d2_d611_4ad6_850a_4ad19544c016.slice/crio-9b0f0536388d6ff5ac626711009a78962f7fcf589407a339a8862800618100b5 WatchSource:0}: Error finding container 9b0f0536388d6ff5ac626711009a78962f7fcf589407a339a8862800618100b5: Status 404 returned error can't find the container with id 9b0f0536388d6ff5ac626711009a78962f7fcf589407a339a8862800618100b5
Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.666889 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 01 15:18:11 crc kubenswrapper[4931]: W1201 15:18:11.686791 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda675ebc0_8c3b_4c43_884f_b32bd954ac6e.slice/crio-88d79ebedca822ce7a61d0f8db4e42deba1eeca895ace38b33f13e029d308ddb WatchSource:0}: Error finding container 88d79ebedca822ce7a61d0f8db4e42deba1eeca895ace38b33f13e029d308ddb: Status 404 returned error can't find the container with id 88d79ebedca822ce7a61d0f8db4e42deba1eeca895ace38b33f13e029d308ddb
Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.898427 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a675ebc0-8c3b-4c43-884f-b32bd954ac6e","Type":"ContainerStarted","Data":"88d79ebedca822ce7a61d0f8db4e42deba1eeca895ace38b33f13e029d308ddb"}
Dec 01 15:18:11 crc kubenswrapper[4931]: I1201 15:18:11.900559 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f8b18d2-d611-4ad6-850a-4ad19544c016","Type":"ContainerStarted","Data":"9b0f0536388d6ff5ac626711009a78962f7fcf589407a339a8862800618100b5"}
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.175061 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.180067 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.186038 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-vw9sg"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.186197 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.193083 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.200779 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.200378 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.217453 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.322916 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-config-data-default\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.323581 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5ctw\" (UniqueName: \"kubernetes.io/projected/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-kube-api-access-s5ctw\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.323659 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.323850 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-kolla-config\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.323884 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.323925 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.323983 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.324048 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.425605 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5ctw\" (UniqueName: \"kubernetes.io/projected/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-kube-api-access-s5ctw\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.425688 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.425785 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-kolla-config\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.425815 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.425854 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.425893 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.425918 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.425959 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-config-data-default\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.427190 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-config-data-default\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.428359 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-kolla-config\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.429299 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.430288 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.431631 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.437162 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.453614 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5ctw\" (UniqueName: \"kubernetes.io/projected/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-kube-api-access-s5ctw\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.458192 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.471417 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa\") " pod="openstack/openstack-galera-0"
Dec 01 15:18:12 crc kubenswrapper[4931]: I1201 15:18:12.513260 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.109780 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.566857 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.568241 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.571562 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mc98p"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.573705 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.574002 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.574227 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.581817 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.754163 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf2acf20-9353-4a11-a09c-3d455a247303-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.754236 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf2acf20-9353-4a11-a09c-3d455a247303-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.754285 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf2acf20-9353-4a11-a09c-3d455a247303-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.754328 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2acf20-9353-4a11-a09c-3d455a247303-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.754367 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf2acf20-9353-4a11-a09c-3d455a247303-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.754422 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk4kv\" (UniqueName: \"kubernetes.io/projected/cf2acf20-9353-4a11-a09c-3d455a247303-kube-api-access-qk4kv\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.754449 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf2acf20-9353-4a11-a09c-3d455a247303-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.754497 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.856290 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2acf20-9353-4a11-a09c-3d455a247303-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.856369 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf2acf20-9353-4a11-a09c-3d455a247303-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.856434 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk4kv\" (UniqueName: \"kubernetes.io/projected/cf2acf20-9353-4a11-a09c-3d455a247303-kube-api-access-qk4kv\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.856465 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf2acf20-9353-4a11-a09c-3d455a247303-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.856518 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.856557 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf2acf20-9353-4a11-a09c-3d455a247303-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.856580 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf2acf20-9353-4a11-a09c-3d455a247303-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.856620 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf2acf20-9353-4a11-a09c-3d455a247303-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.857541 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf2acf20-9353-4a11-a09c-3d455a247303-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.857766 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf2acf20-9353-4a11-a09c-3d455a247303-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.857914 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.859584 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf2acf20-9353-4a11-a09c-3d455a247303-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.859665 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf2acf20-9353-4a11-a09c-3d455a247303-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.874803 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2acf20-9353-4a11-a09c-3d455a247303-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.885632 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk4kv\" (UniqueName: \"kubernetes.io/projected/cf2acf20-9353-4a11-a09c-3d455a247303-kube-api-access-qk4kv\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.887192 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf2acf20-9353-4a11-a09c-3d455a247303-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.929172 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cf2acf20-9353-4a11-a09c-3d455a247303\") " pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.965449 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.968245 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.978022 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.978339 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.978497 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-5rnkd"
Dec 01 15:18:13 crc kubenswrapper[4931]: I1201 15:18:13.979416 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.059162 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/474458a3-5f29-4735-bed5-96f2f1d6e352-kolla-config\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.059263 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vck4k\" (UniqueName: \"kubernetes.io/projected/474458a3-5f29-4735-bed5-96f2f1d6e352-kube-api-access-vck4k\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.059298 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/474458a3-5f29-4735-bed5-96f2f1d6e352-memcached-tls-certs\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.059346 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474458a3-5f29-4735-bed5-96f2f1d6e352-config-data\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.059375 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474458a3-5f29-4735-bed5-96f2f1d6e352-combined-ca-bundle\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.161268 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/474458a3-5f29-4735-bed5-96f2f1d6e352-kolla-config\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.161365 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vck4k\" (UniqueName: \"kubernetes.io/projected/474458a3-5f29-4735-bed5-96f2f1d6e352-kube-api-access-vck4k\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.161412 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/474458a3-5f29-4735-bed5-96f2f1d6e352-memcached-tls-certs\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.161465 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474458a3-5f29-4735-bed5-96f2f1d6e352-config-data\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.161495 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474458a3-5f29-4735-bed5-96f2f1d6e352-combined-ca-bundle\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.162363 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/474458a3-5f29-4735-bed5-96f2f1d6e352-kolla-config\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.162363 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/474458a3-5f29-4735-bed5-96f2f1d6e352-config-data\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.167524 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/474458a3-5f29-4735-bed5-96f2f1d6e352-memcached-tls-certs\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.178075 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474458a3-5f29-4735-bed5-96f2f1d6e352-combined-ca-bundle\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.179569 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vck4k\" (UniqueName: \"kubernetes.io/projected/474458a3-5f29-4735-bed5-96f2f1d6e352-kube-api-access-vck4k\") pod \"memcached-0\" (UID: \"474458a3-5f29-4735-bed5-96f2f1d6e352\") " pod="openstack/memcached-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.198632 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 01 15:18:14 crc kubenswrapper[4931]: I1201 15:18:14.366318 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 01 15:18:16 crc kubenswrapper[4931]: I1201 15:18:16.083932 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 01 15:18:16 crc kubenswrapper[4931]: I1201 15:18:16.096535 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 01 15:18:16 crc kubenswrapper[4931]: I1201 15:18:16.097085 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 01 15:18:16 crc kubenswrapper[4931]: I1201 15:18:16.102803 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xb5jr"
Dec 01 15:18:16 crc kubenswrapper[4931]: I1201 15:18:16.193469 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d2gh\" (UniqueName: \"kubernetes.io/projected/5a527180-b5fc-47ab-b4e6-24aa23ae703a-kube-api-access-2d2gh\") pod \"kube-state-metrics-0\" (UID: \"5a527180-b5fc-47ab-b4e6-24aa23ae703a\") " pod="openstack/kube-state-metrics-0"
Dec 01 15:18:16 crc kubenswrapper[4931]: I1201 15:18:16.294906 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d2gh\" (UniqueName: \"kubernetes.io/projected/5a527180-b5fc-47ab-b4e6-24aa23ae703a-kube-api-access-2d2gh\") pod \"kube-state-metrics-0\" (UID: \"5a527180-b5fc-47ab-b4e6-24aa23ae703a\") " pod="openstack/kube-state-metrics-0"
Dec 01 15:18:16 crc kubenswrapper[4931]: I1201 15:18:16.321458 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d2gh\" (UniqueName: \"kubernetes.io/projected/5a527180-b5fc-47ab-b4e6-24aa23ae703a-kube-api-access-2d2gh\") pod \"kube-state-metrics-0\" (UID: \"5a527180-b5fc-47ab-b4e6-24aa23ae703a\") " pod="openstack/kube-state-metrics-0"
Dec 01 15:18:16 crc kubenswrapper[4931]: I1201 15:18:16.427892 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 01 15:18:18 crc kubenswrapper[4931]: I1201 15:18:17.999862 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa","Type":"ContainerStarted","Data":"124c36de0618c973e66b8096486599ab9b5eb690f79385e49421d7a1d52ad9be"}
Dec 01 15:18:18 crc kubenswrapper[4931]: I1201 15:18:18.724519 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.426746 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v8h85"]
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.428053 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v8h85"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.431358 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-mtb6l"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.431568 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.432800 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.450370 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v8h85"]
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.509256 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cgg9p"]
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.511462 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cgg9p"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.525469 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cgg9p"]
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.576194 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj967\" (UniqueName: \"kubernetes.io/projected/6f943374-baa7-4200-93ff-6773c58b032d-kube-api-access-qj967\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.576259 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f943374-baa7-4200-93ff-6773c58b032d-combined-ca-bundle\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.576287 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f943374-baa7-4200-93ff-6773c58b032d-var-log-ovn\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.576531 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f943374-baa7-4200-93ff-6773c58b032d-var-run\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.576726 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f943374-baa7-4200-93ff-6773c58b032d-ovn-controller-tls-certs\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.576806 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f943374-baa7-4200-93ff-6773c58b032d-var-run-ovn\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.576958 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f943374-baa7-4200-93ff-6773c58b032d-scripts\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678411 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f943374-baa7-4200-93ff-6773c58b032d-var-run-ovn\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678475 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-scripts\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p"
Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678522 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f943374-baa7-4200-93ff-6773c58b032d-scripts\") pod \"ovn-controller-v8h85\" (UID:
\"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678551 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj967\" (UniqueName: \"kubernetes.io/projected/6f943374-baa7-4200-93ff-6773c58b032d-kube-api-access-qj967\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678570 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f943374-baa7-4200-93ff-6773c58b032d-combined-ca-bundle\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678590 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f943374-baa7-4200-93ff-6773c58b032d-var-log-ovn\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678613 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-var-lib\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678632 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f943374-baa7-4200-93ff-6773c58b032d-var-run\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: 
I1201 15:18:19.678673 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8h86\" (UniqueName: \"kubernetes.io/projected/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-kube-api-access-f8h86\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678694 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-var-run\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.678721 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f943374-baa7-4200-93ff-6773c58b032d-ovn-controller-tls-certs\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.679082 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-etc-ovs\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.679182 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-var-log\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.679231 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f943374-baa7-4200-93ff-6773c58b032d-var-run-ovn\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.679321 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6f943374-baa7-4200-93ff-6773c58b032d-var-run\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.679420 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6f943374-baa7-4200-93ff-6773c58b032d-var-log-ovn\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.683347 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f943374-baa7-4200-93ff-6773c58b032d-scripts\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.685269 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f943374-baa7-4200-93ff-6773c58b032d-combined-ca-bundle\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.697470 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f943374-baa7-4200-93ff-6773c58b032d-ovn-controller-tls-certs\") pod 
\"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.697640 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj967\" (UniqueName: \"kubernetes.io/projected/6f943374-baa7-4200-93ff-6773c58b032d-kube-api-access-qj967\") pod \"ovn-controller-v8h85\" (UID: \"6f943374-baa7-4200-93ff-6773c58b032d\") " pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.751257 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v8h85" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.780134 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-scripts\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.780230 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-var-lib\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.780276 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8h86\" (UniqueName: \"kubernetes.io/projected/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-kube-api-access-f8h86\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.780295 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-var-run\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.780322 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-etc-ovs\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.780353 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-var-log\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.780639 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-var-log\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.780720 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-var-lib\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.780782 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-var-run\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " 
pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.781008 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-etc-ovs\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.784440 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-scripts\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.796854 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8h86\" (UniqueName: \"kubernetes.io/projected/a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24-kube-api-access-f8h86\") pod \"ovn-controller-ovs-cgg9p\" (UID: \"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24\") " pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.831080 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.872848 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.872910 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.872960 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.873691 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f551f2eb27cc6e8158d6be30d6ee18e92fc02ccf79b3c9e4d9f5dcf4740103b"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:18:19 crc kubenswrapper[4931]: I1201 15:18:19.873759 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://4f551f2eb27cc6e8158d6be30d6ee18e92fc02ccf79b3c9e4d9f5dcf4740103b" gracePeriod=600 Dec 01 15:18:20 crc kubenswrapper[4931]: I1201 15:18:20.016062 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="4f551f2eb27cc6e8158d6be30d6ee18e92fc02ccf79b3c9e4d9f5dcf4740103b" exitCode=0 Dec 01 15:18:20 crc kubenswrapper[4931]: I1201 15:18:20.016106 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"4f551f2eb27cc6e8158d6be30d6ee18e92fc02ccf79b3c9e4d9f5dcf4740103b"} Dec 01 15:18:20 crc kubenswrapper[4931]: I1201 15:18:20.016137 4931 scope.go:117] "RemoveContainer" containerID="2c593bc454b5d325cbd0967c1c5d7f0f229621585e06f8319b965d66c0d93b5d" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.036068 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.039113 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.046006 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.091361 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dsgc2" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.094911 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.099294 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.099368 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.099449 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 01 15:18:23 
crc kubenswrapper[4931]: I1201 15:18:23.100763 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"474458a3-5f29-4735-bed5-96f2f1d6e352","Type":"ContainerStarted","Data":"7ec214c0d038534272564973265062a007c4f45138e70dad154f87620d09d0ff"} Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.138545 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08b55bc-d817-44c5-81f3-6581236b50c1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.138793 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08b55bc-d817-44c5-81f3-6581236b50c1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.138928 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d08b55bc-d817-44c5-81f3-6581236b50c1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.139029 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08b55bc-d817-44c5-81f3-6581236b50c1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.139105 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.139248 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08b55bc-d817-44c5-81f3-6581236b50c1-config\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.139349 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8m8\" (UniqueName: \"kubernetes.io/projected/d08b55bc-d817-44c5-81f3-6581236b50c1-kube-api-access-nh8m8\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.139441 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d08b55bc-d817-44c5-81f3-6581236b50c1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.211845 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.218896 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.224502 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.224601 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.225122 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.225233 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-pflpj" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.227026 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.241603 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d08b55bc-d817-44c5-81f3-6581236b50c1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.241890 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08b55bc-d817-44c5-81f3-6581236b50c1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.242030 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 
crc kubenswrapper[4931]: I1201 15:18:23.242145 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08b55bc-d817-44c5-81f3-6581236b50c1-config\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.242278 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8m8\" (UniqueName: \"kubernetes.io/projected/d08b55bc-d817-44c5-81f3-6581236b50c1-kube-api-access-nh8m8\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.242405 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d08b55bc-d817-44c5-81f3-6581236b50c1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.242556 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08b55bc-d817-44c5-81f3-6581236b50c1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.242665 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08b55bc-d817-44c5-81f3-6581236b50c1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.243409 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d08b55bc-d817-44c5-81f3-6581236b50c1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.243960 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d08b55bc-d817-44c5-81f3-6581236b50c1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.243986 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.244636 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08b55bc-d817-44c5-81f3-6581236b50c1-config\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.252358 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08b55bc-d817-44c5-81f3-6581236b50c1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.252557 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08b55bc-d817-44c5-81f3-6581236b50c1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 
15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.252707 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08b55bc-d817-44c5-81f3-6581236b50c1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.263726 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8m8\" (UniqueName: \"kubernetes.io/projected/d08b55bc-d817-44c5-81f3-6581236b50c1-kube-api-access-nh8m8\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.269585 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d08b55bc-d817-44c5-81f3-6581236b50c1\") " pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.345479 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.345549 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5fc34b-28c1-4e76-8de7-aa52db803802-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.345584 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8fzx2\" (UniqueName: \"kubernetes.io/projected/4f5fc34b-28c1-4e76-8de7-aa52db803802-kube-api-access-8fzx2\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.345639 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f5fc34b-28c1-4e76-8de7-aa52db803802-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.345672 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5fc34b-28c1-4e76-8de7-aa52db803802-config\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.345712 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5fc34b-28c1-4e76-8de7-aa52db803802-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.345742 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5fc34b-28c1-4e76-8de7-aa52db803802-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:23 crc kubenswrapper[4931]: I1201 15:18:23.345778 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4f5fc34b-28c1-4e76-8de7-aa52db803802-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.424681 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.446966 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.447014 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5fc34b-28c1-4e76-8de7-aa52db803802-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.447036 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fzx2\" (UniqueName: \"kubernetes.io/projected/4f5fc34b-28c1-4e76-8de7-aa52db803802-kube-api-access-8fzx2\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.447066 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f5fc34b-28c1-4e76-8de7-aa52db803802-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.447091 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4f5fc34b-28c1-4e76-8de7-aa52db803802-config\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.447114 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5fc34b-28c1-4e76-8de7-aa52db803802-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.447142 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5fc34b-28c1-4e76-8de7-aa52db803802-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.447165 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f5fc34b-28c1-4e76-8de7-aa52db803802-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.447210 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.447816 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f5fc34b-28c1-4e76-8de7-aa52db803802-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " 
pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.448300 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5fc34b-28c1-4e76-8de7-aa52db803802-config\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.448823 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f5fc34b-28c1-4e76-8de7-aa52db803802-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.452434 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5fc34b-28c1-4e76-8de7-aa52db803802-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.453551 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f5fc34b-28c1-4e76-8de7-aa52db803802-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.463680 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f5fc34b-28c1-4e76-8de7-aa52db803802-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.463776 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fzx2\" (UniqueName: 
\"kubernetes.io/projected/4f5fc34b-28c1-4e76-8de7-aa52db803802-kube-api-access-8fzx2\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.479276 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f5fc34b-28c1-4e76-8de7-aa52db803802\") " pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:24 crc kubenswrapper[4931]: I1201 15:18:23.536364 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.100900 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3029394181/1\": happened during read: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.101827 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5ffh77hb6h5d5h687h6ch84hdbh6h97h5bfh646h5c5h58fh5f6h68ch8h9bh9ch5b7h686h86h89h658h644h6h55dhdbh66dh54dh587hcbq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vck4k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(474458a3-5f29-4735-bed5-96f2f1d6e352): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3029394181/1\": happened during read: context canceled" logger="UnhandledError" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.103028 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage3029394181/1\\\": happened during read: context canceled\"" pod="openstack/memcached-0" podUID="474458a3-5f29-4735-bed5-96f2f1d6e352" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.201152 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 15:18:38 
crc kubenswrapper[4931]: E1201 15:18:38.201672 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkch7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdi
n:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-f58nx_openstack(87514fe1-9d83-41dc-90fb-aae028a6bf34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.203444 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-f58nx" podUID="87514fe1-9d83-41dc-90fb-aae028a6bf34" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.221427 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.221655 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wwmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-sfgtv_openstack(0b75a3c4-3990-4ea8-afb0-bb60d15a8b50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.222943 4931 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" podUID="0b75a3c4-3990-4ea8-afb0-bb60d15a8b50" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.226362 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="474458a3-5f29-4735-bed5-96f2f1d6e352" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.272874 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-f58nx" podUID="87514fe1-9d83-41dc-90fb-aae028a6bf34" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.272952 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.273095 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kdwgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-7lrkc_openstack(c27f51a4-79b4-482c-a9a7-f125c538edfc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.273167 4931 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.273234 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m9htv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfil
e:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-zl7j9_openstack(91033537-0105-4072-bc00-90be0d75b961): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.279139 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" podUID="c27f51a4-79b4-482c-a9a7-f125c538edfc" Dec 01 15:18:38 crc kubenswrapper[4931]: E1201 15:18:38.279139 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" podUID="91033537-0105-4072-bc00-90be0d75b961" Dec 01 15:18:38 crc kubenswrapper[4931]: I1201 15:18:38.627279 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 15:18:38 crc kubenswrapper[4931]: I1201 15:18:38.697365 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 15:18:39 crc kubenswrapper[4931]: E1201 15:18:39.243695 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" podUID="0b75a3c4-3990-4ea8-afb0-bb60d15a8b50" Dec 01 15:18:40 crc kubenswrapper[4931]: E1201 15:18:40.252064 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 01 15:18:40 crc kubenswrapper[4931]: E1201 15:18:40.252550 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s5ctw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:n
il,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:18:40 crc kubenswrapper[4931]: E1201 15:18:40.254405 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa" Dec 01 15:18:40 crc kubenswrapper[4931]: W1201 15:18:40.264930 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf2acf20_9353_4a11_a09c_3d455a247303.slice/crio-fb50144fc3c7931d249b55226b6f405fd5b24f7a6f326fad0c49d9625f312581 WatchSource:0}: Error finding container fb50144fc3c7931d249b55226b6f405fd5b24f7a6f326fad0c49d9625f312581: Status 404 returned error can't find the container with id fb50144fc3c7931d249b55226b6f405fd5b24f7a6f326fad0c49d9625f312581 Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.417962 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.455555 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.506295 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-dns-svc\") pod \"c27f51a4-79b4-482c-a9a7-f125c538edfc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.506363 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-config\") pod \"c27f51a4-79b4-482c-a9a7-f125c538edfc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.506427 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdwgp\" (UniqueName: \"kubernetes.io/projected/c27f51a4-79b4-482c-a9a7-f125c538edfc-kube-api-access-kdwgp\") pod \"c27f51a4-79b4-482c-a9a7-f125c538edfc\" (UID: \"c27f51a4-79b4-482c-a9a7-f125c538edfc\") " Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.507277 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c27f51a4-79b4-482c-a9a7-f125c538edfc" (UID: "c27f51a4-79b4-482c-a9a7-f125c538edfc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.508313 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-config" (OuterVolumeSpecName: "config") pod "c27f51a4-79b4-482c-a9a7-f125c538edfc" (UID: "c27f51a4-79b4-482c-a9a7-f125c538edfc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.514609 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27f51a4-79b4-482c-a9a7-f125c538edfc-kube-api-access-kdwgp" (OuterVolumeSpecName: "kube-api-access-kdwgp") pod "c27f51a4-79b4-482c-a9a7-f125c538edfc" (UID: "c27f51a4-79b4-482c-a9a7-f125c538edfc"). InnerVolumeSpecName "kube-api-access-kdwgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.608354 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91033537-0105-4072-bc00-90be0d75b961-config\") pod \"91033537-0105-4072-bc00-90be0d75b961\" (UID: \"91033537-0105-4072-bc00-90be0d75b961\") " Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.608687 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9htv\" (UniqueName: \"kubernetes.io/projected/91033537-0105-4072-bc00-90be0d75b961-kube-api-access-m9htv\") pod \"91033537-0105-4072-bc00-90be0d75b961\" (UID: \"91033537-0105-4072-bc00-90be0d75b961\") " Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.609098 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.609117 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c27f51a4-79b4-482c-a9a7-f125c538edfc-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.609128 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdwgp\" (UniqueName: \"kubernetes.io/projected/c27f51a4-79b4-482c-a9a7-f125c538edfc-kube-api-access-kdwgp\") on node \"crc\" 
DevicePath \"\"" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.609718 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91033537-0105-4072-bc00-90be0d75b961-config" (OuterVolumeSpecName: "config") pod "91033537-0105-4072-bc00-90be0d75b961" (UID: "91033537-0105-4072-bc00-90be0d75b961"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.614775 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91033537-0105-4072-bc00-90be0d75b961-kube-api-access-m9htv" (OuterVolumeSpecName: "kube-api-access-m9htv") pod "91033537-0105-4072-bc00-90be0d75b961" (UID: "91033537-0105-4072-bc00-90be0d75b961"). InnerVolumeSpecName "kube-api-access-m9htv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.710349 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91033537-0105-4072-bc00-90be0d75b961-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.710398 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9htv\" (UniqueName: \"kubernetes.io/projected/91033537-0105-4072-bc00-90be0d75b961-kube-api-access-m9htv\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.815990 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v8h85"] Dec 01 15:18:40 crc kubenswrapper[4931]: W1201 15:18:40.826792 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f943374_baa7_4200_93ff_6773c58b032d.slice/crio-3465c976f9fb0aa5404edda449ccef82dd8a32533d2fd0452dfa955006a3b45a WatchSource:0}: Error finding container 
3465c976f9fb0aa5404edda449ccef82dd8a32533d2fd0452dfa955006a3b45a: Status 404 returned error can't find the container with id 3465c976f9fb0aa5404edda449ccef82dd8a32533d2fd0452dfa955006a3b45a Dec 01 15:18:40 crc kubenswrapper[4931]: W1201 15:18:40.849708 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f5fc34b_28c1_4e76_8de7_aa52db803802.slice/crio-cb11268d035b238856a47c6aa0f43e88ba940098907263be70e7515fcf6c9e76 WatchSource:0}: Error finding container cb11268d035b238856a47c6aa0f43e88ba940098907263be70e7515fcf6c9e76: Status 404 returned error can't find the container with id cb11268d035b238856a47c6aa0f43e88ba940098907263be70e7515fcf6c9e76 Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.849857 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 15:18:40 crc kubenswrapper[4931]: I1201 15:18:40.956529 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cgg9p"] Dec 01 15:18:41 crc kubenswrapper[4931]: W1201 15:18:41.071429 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda35e17f6_4c8d_4b5d_aea6_5e0dc2000a24.slice/crio-5d8324cebcc6823052ddb069d92588ddb6c0699563208d96f8b7248b61a7c7e2 WatchSource:0}: Error finding container 5d8324cebcc6823052ddb069d92588ddb6c0699563208d96f8b7248b61a7c7e2: Status 404 returned error can't find the container with id 5d8324cebcc6823052ddb069d92588ddb6c0699563208d96f8b7248b61a7c7e2 Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.250250 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" event={"ID":"91033537-0105-4072-bc00-90be0d75b961","Type":"ContainerDied","Data":"ec0396d8eae90bc58b8b0d40abce78e9e56e5c96e871c8a6d4a4dadfcc3224b5"} Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.250293 4931 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zl7j9" Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.252534 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cf2acf20-9353-4a11-a09c-3d455a247303","Type":"ContainerStarted","Data":"893db54a05edd0b8bae2ad9fae30581d7e869c303db142288d918cd519099074"} Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.252600 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cf2acf20-9353-4a11-a09c-3d455a247303","Type":"ContainerStarted","Data":"fb50144fc3c7931d249b55226b6f405fd5b24f7a6f326fad0c49d9625f312581"} Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.255263 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v8h85" event={"ID":"6f943374-baa7-4200-93ff-6773c58b032d","Type":"ContainerStarted","Data":"3465c976f9fb0aa5404edda449ccef82dd8a32533d2fd0452dfa955006a3b45a"} Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.257007 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a527180-b5fc-47ab-b4e6-24aa23ae703a","Type":"ContainerStarted","Data":"1c554758ebab44c907e6bdb2febf8b2b0a1e1c96e9345f9c83854edd9d36944d"} Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.260704 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4f5fc34b-28c1-4e76-8de7-aa52db803802","Type":"ContainerStarted","Data":"cb11268d035b238856a47c6aa0f43e88ba940098907263be70e7515fcf6c9e76"} Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.261841 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cgg9p" event={"ID":"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24","Type":"ContainerStarted","Data":"5d8324cebcc6823052ddb069d92588ddb6c0699563208d96f8b7248b61a7c7e2"} Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 
15:18:41.265011 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"57835c837fadcd2c88b5f726b0f5a7aef7db7caf224620b840275c3b23741956"} Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.265866 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.266352 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7lrkc" event={"ID":"c27f51a4-79b4-482c-a9a7-f125c538edfc","Type":"ContainerDied","Data":"e586833bbfb376522ee444d4032ce1f41cd18aa915b5088d799f880995d49350"} Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.665174 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zl7j9"] Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.665221 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zl7j9"] Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.713261 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7lrkc"] Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.722145 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7lrkc"] Dec 01 15:18:41 crc kubenswrapper[4931]: I1201 15:18:41.726163 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 15:18:42 crc kubenswrapper[4931]: W1201 15:18:42.031068 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd08b55bc_d817_44c5_81f3_6581236b50c1.slice/crio-c011545529d8fcd4dc5953c1c0d74374a04849ebeebdf788186d73ee7f200523 WatchSource:0}: Error finding container 
c011545529d8fcd4dc5953c1c0d74374a04849ebeebdf788186d73ee7f200523: Status 404 returned error can't find the container with id c011545529d8fcd4dc5953c1c0d74374a04849ebeebdf788186d73ee7f200523 Dec 01 15:18:42 crc kubenswrapper[4931]: I1201 15:18:42.256588 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91033537-0105-4072-bc00-90be0d75b961" path="/var/lib/kubelet/pods/91033537-0105-4072-bc00-90be0d75b961/volumes" Dec 01 15:18:42 crc kubenswrapper[4931]: I1201 15:18:42.257241 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c27f51a4-79b4-482c-a9a7-f125c538edfc" path="/var/lib/kubelet/pods/c27f51a4-79b4-482c-a9a7-f125c538edfc/volumes" Dec 01 15:18:42 crc kubenswrapper[4931]: I1201 15:18:42.289145 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f8b18d2-d611-4ad6-850a-4ad19544c016","Type":"ContainerStarted","Data":"c4f945b9f807d6cde63f94d15690bb2e2f0ad176652953dc94b9fb21983d50d3"} Dec 01 15:18:42 crc kubenswrapper[4931]: I1201 15:18:42.291439 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d08b55bc-d817-44c5-81f3-6581236b50c1","Type":"ContainerStarted","Data":"c011545529d8fcd4dc5953c1c0d74374a04849ebeebdf788186d73ee7f200523"} Dec 01 15:18:42 crc kubenswrapper[4931]: I1201 15:18:42.292818 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a675ebc0-8c3b-4c43-884f-b32bd954ac6e","Type":"ContainerStarted","Data":"44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff"} Dec 01 15:18:45 crc kubenswrapper[4931]: I1201 15:18:45.321063 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa","Type":"ContainerStarted","Data":"04196200f59eeb7f037f6c658acd91764348bf16580fad30120f09030512ab1a"} Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.329537 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d08b55bc-d817-44c5-81f3-6581236b50c1","Type":"ContainerStarted","Data":"97a1efbd9c109138b778c82050165849092e8c34131bb8adcc9e5c6593abde49"} Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.331127 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a527180-b5fc-47ab-b4e6-24aa23ae703a","Type":"ContainerStarted","Data":"851bebbdf9434cc7a0389a8bc6ba04a0717f6cc8f039f733d2e7af4bff28bb03"} Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.331262 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.332692 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4f5fc34b-28c1-4e76-8de7-aa52db803802","Type":"ContainerStarted","Data":"9d78d3dbcd723a9a778e3980aa93d904e9bd3249cf3b748762c6bb85199a7083"} Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.334920 4931 generic.go:334] "Generic (PLEG): container finished" podID="a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24" containerID="41d85e48134e34f13428995bf42b24b154922601bbf91384735f4c402ba1c1aa" exitCode=0 Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.334990 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cgg9p" event={"ID":"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24","Type":"ContainerDied","Data":"41d85e48134e34f13428995bf42b24b154922601bbf91384735f4c402ba1c1aa"} Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.336576 4931 generic.go:334] "Generic (PLEG): container finished" podID="cf2acf20-9353-4a11-a09c-3d455a247303" containerID="893db54a05edd0b8bae2ad9fae30581d7e869c303db142288d918cd519099074" exitCode=0 Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.336629 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"cf2acf20-9353-4a11-a09c-3d455a247303","Type":"ContainerDied","Data":"893db54a05edd0b8bae2ad9fae30581d7e869c303db142288d918cd519099074"} Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.338594 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v8h85" event={"ID":"6f943374-baa7-4200-93ff-6773c58b032d","Type":"ContainerStarted","Data":"39358c770b9cc09c761aec73dd0ae152de84b53f694864b60a09e5a189feaefc"} Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.338745 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-v8h85" Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.356168 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.980706707 podStartE2EDuration="30.356145452s" podCreationTimestamp="2025-12-01 15:18:16 +0000 UTC" firstStartedPulling="2025-12-01 15:18:40.278127165 +0000 UTC m=+1066.704000832" lastFinishedPulling="2025-12-01 15:18:45.65356591 +0000 UTC m=+1072.079439577" observedRunningTime="2025-12-01 15:18:46.353778874 +0000 UTC m=+1072.779652561" watchObservedRunningTime="2025-12-01 15:18:46.356145452 +0000 UTC m=+1072.782019129" Dec 01 15:18:46 crc kubenswrapper[4931]: I1201 15:18:46.409447 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v8h85" podStartSLOduration=22.552132375 podStartE2EDuration="27.409427802s" podCreationTimestamp="2025-12-01 15:18:19 +0000 UTC" firstStartedPulling="2025-12-01 15:18:40.830128019 +0000 UTC m=+1067.256001686" lastFinishedPulling="2025-12-01 15:18:45.687423446 +0000 UTC m=+1072.113297113" observedRunningTime="2025-12-01 15:18:46.404133561 +0000 UTC m=+1072.830007228" watchObservedRunningTime="2025-12-01 15:18:46.409427802 +0000 UTC m=+1072.835301479" Dec 01 15:18:47 crc kubenswrapper[4931]: I1201 15:18:47.351224 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"cf2acf20-9353-4a11-a09c-3d455a247303","Type":"ContainerStarted","Data":"28e93d372ba70c7513d14e600eef07cc1316b7f4db0f941cc35c6051544ed359"} Dec 01 15:18:47 crc kubenswrapper[4931]: I1201 15:18:47.355858 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cgg9p" event={"ID":"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24","Type":"ContainerStarted","Data":"da370f49dae0dc003918e50b64a71153d80175f4d8a1808700edb8b43ebd57b7"} Dec 01 15:18:47 crc kubenswrapper[4931]: I1201 15:18:47.355897 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cgg9p" event={"ID":"a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24","Type":"ContainerStarted","Data":"06f5c0c819ffc5742a9a01c1a4c3a57f8fa6a55859eebc0a7fe02fc2bfdc21a0"} Dec 01 15:18:47 crc kubenswrapper[4931]: I1201 15:18:47.355912 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:47 crc kubenswrapper[4931]: I1201 15:18:47.355943 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:18:47 crc kubenswrapper[4931]: I1201 15:18:47.374497 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=34.893624182 podStartE2EDuration="35.374477855s" podCreationTimestamp="2025-12-01 15:18:12 +0000 UTC" firstStartedPulling="2025-12-01 15:18:40.278217928 +0000 UTC m=+1066.704091605" lastFinishedPulling="2025-12-01 15:18:40.759071611 +0000 UTC m=+1067.184945278" observedRunningTime="2025-12-01 15:18:47.372763966 +0000 UTC m=+1073.798637653" watchObservedRunningTime="2025-12-01 15:18:47.374477855 +0000 UTC m=+1073.800351522" Dec 01 15:18:47 crc kubenswrapper[4931]: I1201 15:18:47.394082 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cgg9p" podStartSLOduration=23.81434863 
podStartE2EDuration="28.394056604s" podCreationTimestamp="2025-12-01 15:18:19 +0000 UTC" firstStartedPulling="2025-12-01 15:18:41.073847915 +0000 UTC m=+1067.499721582" lastFinishedPulling="2025-12-01 15:18:45.653555889 +0000 UTC m=+1072.079429556" observedRunningTime="2025-12-01 15:18:47.390553954 +0000 UTC m=+1073.816427641" watchObservedRunningTime="2025-12-01 15:18:47.394056604 +0000 UTC m=+1073.819930271" Dec 01 15:18:49 crc kubenswrapper[4931]: I1201 15:18:49.370617 4931 generic.go:334] "Generic (PLEG): container finished" podID="6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa" containerID="04196200f59eeb7f037f6c658acd91764348bf16580fad30120f09030512ab1a" exitCode=0 Dec 01 15:18:49 crc kubenswrapper[4931]: I1201 15:18:49.370691 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa","Type":"ContainerDied","Data":"04196200f59eeb7f037f6c658acd91764348bf16580fad30120f09030512ab1a"} Dec 01 15:18:49 crc kubenswrapper[4931]: I1201 15:18:49.372819 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d08b55bc-d817-44c5-81f3-6581236b50c1","Type":"ContainerStarted","Data":"767b9c4fa9560ca176d9a9eae6bcb15605c9e87069bb6899c230d8a7db549fe7"} Dec 01 15:18:49 crc kubenswrapper[4931]: I1201 15:18:49.375510 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4f5fc34b-28c1-4e76-8de7-aa52db803802","Type":"ContainerStarted","Data":"2b143a845281b04da25609289be281f1ea0f97a1782f94f4bf8ce108edb556f1"} Dec 01 15:18:49 crc kubenswrapper[4931]: I1201 15:18:49.413675 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=21.453892071 podStartE2EDuration="28.413659383s" podCreationTimestamp="2025-12-01 15:18:21 +0000 UTC" firstStartedPulling="2025-12-01 15:18:42.034314417 +0000 UTC m=+1068.460188094" lastFinishedPulling="2025-12-01 15:18:48.994081739 
+0000 UTC m=+1075.419955406" observedRunningTime="2025-12-01 15:18:49.411596395 +0000 UTC m=+1075.837470062" watchObservedRunningTime="2025-12-01 15:18:49.413659383 +0000 UTC m=+1075.839533050" Dec 01 15:18:49 crc kubenswrapper[4931]: I1201 15:18:49.428242 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.258902157 podStartE2EDuration="27.428225479s" podCreationTimestamp="2025-12-01 15:18:22 +0000 UTC" firstStartedPulling="2025-12-01 15:18:40.853302211 +0000 UTC m=+1067.279175878" lastFinishedPulling="2025-12-01 15:18:49.022625533 +0000 UTC m=+1075.448499200" observedRunningTime="2025-12-01 15:18:49.427976042 +0000 UTC m=+1075.853849709" watchObservedRunningTime="2025-12-01 15:18:49.428225479 +0000 UTC m=+1075.854099146" Dec 01 15:18:50 crc kubenswrapper[4931]: E1201 15:18:50.226296 4931 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.2:42352->38.102.83.2:42215: write tcp 38.102.83.2:42352->38.102.83.2:42215: write: broken pipe Dec 01 15:18:50 crc kubenswrapper[4931]: I1201 15:18:50.383822 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa","Type":"ContainerStarted","Data":"b06b0366bc73392233330d4b5f409960b00c343c10cb1a9262e1200ac8ef8e88"} Dec 01 15:18:50 crc kubenswrapper[4931]: I1201 15:18:50.402697 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371997.452095 podStartE2EDuration="39.40268009s" podCreationTimestamp="2025-12-01 15:18:11 +0000 UTC" firstStartedPulling="2025-12-01 15:18:17.642301434 +0000 UTC m=+1044.068175111" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:18:50.399085318 +0000 UTC m=+1076.824958985" watchObservedRunningTime="2025-12-01 15:18:50.40268009 +0000 UTC m=+1076.828553757" Dec 01 15:18:50 crc 
kubenswrapper[4931]: I1201 15:18:50.425915 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:50 crc kubenswrapper[4931]: I1201 15:18:50.465323 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:50 crc kubenswrapper[4931]: I1201 15:18:50.537206 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:50 crc kubenswrapper[4931]: I1201 15:18:50.573338 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.390568 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.391045 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.438163 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.442612 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.695802 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sfgtv"] Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.743919 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-tnvwb"] Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.745139 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.749328 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.789559 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-tnvwb"] Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.825289 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xhksj"] Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.826567 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.828773 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.840077 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xhksj"] Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.883662 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.885243 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.890523 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.890719 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.891115 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-mpljf" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.891255 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.893854 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0476eb16-ff0d-476b-b6da-dd437e123f26-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.893883 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcx8f\" (UniqueName: \"kubernetes.io/projected/0476eb16-ff0d-476b-b6da-dd437e123f26-kube-api-access-wcx8f\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.893944 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0476eb16-ff0d-476b-b6da-dd437e123f26-ovn-rundir\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:51 crc 
kubenswrapper[4931]: I1201 15:18:51.893967 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0476eb16-ff0d-476b-b6da-dd437e123f26-config\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.894032 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d62g\" (UniqueName: \"kubernetes.io/projected/6957cd98-d749-411e-b8ac-bcfafbdfeb35-kube-api-access-4d62g\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.894094 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.894125 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0476eb16-ff0d-476b-b6da-dd437e123f26-ovs-rundir\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.894255 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 
01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.894310 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0476eb16-ff0d-476b-b6da-dd437e123f26-combined-ca-bundle\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.894374 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-config\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.904311 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.928210 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-f58nx"] Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.962166 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-th96h"] Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.967416 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:51 crc kubenswrapper[4931]: I1201 15:18:51.970418 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000323 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-scripts\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000369 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0476eb16-ff0d-476b-b6da-dd437e123f26-ovs-rundir\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000406 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000434 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000456 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000472 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frs7w\" (UniqueName: \"kubernetes.io/projected/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-kube-api-access-frs7w\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000496 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0476eb16-ff0d-476b-b6da-dd437e123f26-combined-ca-bundle\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000532 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-config\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000553 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-config\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000572 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0476eb16-ff0d-476b-b6da-dd437e123f26-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000590 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcx8f\" (UniqueName: \"kubernetes.io/projected/0476eb16-ff0d-476b-b6da-dd437e123f26-kube-api-access-wcx8f\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000623 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0476eb16-ff0d-476b-b6da-dd437e123f26-ovn-rundir\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000643 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0476eb16-ff0d-476b-b6da-dd437e123f26-config\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000656 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000675 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d62g\" (UniqueName: \"kubernetes.io/projected/6957cd98-d749-411e-b8ac-bcfafbdfeb35-kube-api-access-4d62g\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: 
\"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000696 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000704 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0476eb16-ff0d-476b-b6da-dd437e123f26-ovs-rundir\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.000726 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.001516 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0476eb16-ff0d-476b-b6da-dd437e123f26-ovn-rundir\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.001586 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 
15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.001786 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0476eb16-ff0d-476b-b6da-dd437e123f26-config\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.002024 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-config\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.002415 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.008158 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0476eb16-ff0d-476b-b6da-dd437e123f26-combined-ca-bundle\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.016263 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-th96h"] Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.018805 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0476eb16-ff0d-476b-b6da-dd437e123f26-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " 
pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.022825 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcx8f\" (UniqueName: \"kubernetes.io/projected/0476eb16-ff0d-476b-b6da-dd437e123f26-kube-api-access-wcx8f\") pod \"ovn-controller-metrics-xhksj\" (UID: \"0476eb16-ff0d-476b-b6da-dd437e123f26\") " pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.027006 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d62g\" (UniqueName: \"kubernetes.io/projected/6957cd98-d749-411e-b8ac-bcfafbdfeb35-kube-api-access-4d62g\") pod \"dnsmasq-dns-6bc7876d45-tnvwb\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.091400 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.101735 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.101776 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.101810 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-config\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.101833 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.101852 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-scripts\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.101873 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.101946 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.101965 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: 
\"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.101982 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frs7w\" (UniqueName: \"kubernetes.io/projected/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-kube-api-access-frs7w\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.102016 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-dns-svc\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.102045 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-config\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.102069 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqn8d\" (UniqueName: \"kubernetes.io/projected/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-kube-api-access-dqn8d\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.102788 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: 
I1201 15:18:52.103288 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-config\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.104642 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-scripts\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.106258 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.107750 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.111890 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.120351 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frs7w\" (UniqueName: \"kubernetes.io/projected/ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47-kube-api-access-frs7w\") pod \"ovn-northd-0\" (UID: 
\"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47\") " pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.146327 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xhksj" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.202929 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.203174 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-dns-svc\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.203212 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqn8d\" (UniqueName: \"kubernetes.io/projected/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-kube-api-access-dqn8d\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.203266 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-config\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.203287 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.204117 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.205527 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-dns-svc\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.205969 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-config\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.206233 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.207552 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.223446 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.235431 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqn8d\" (UniqueName: \"kubernetes.io/projected/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-kube-api-access-dqn8d\") pod \"dnsmasq-dns-8554648995-th96h\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.284935 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.295488 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.304426 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-config\") pod \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.304477 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-dns-svc\") pod \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.304571 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wwmn\" (UniqueName: \"kubernetes.io/projected/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-kube-api-access-8wwmn\") pod \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\" (UID: \"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50\") " Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.305199 4931 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b75a3c4-3990-4ea8-afb0-bb60d15a8b50" (UID: "0b75a3c4-3990-4ea8-afb0-bb60d15a8b50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.305210 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-config" (OuterVolumeSpecName: "config") pod "0b75a3c4-3990-4ea8-afb0-bb60d15a8b50" (UID: "0b75a3c4-3990-4ea8-afb0-bb60d15a8b50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.308487 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-kube-api-access-8wwmn" (OuterVolumeSpecName: "kube-api-access-8wwmn") pod "0b75a3c4-3990-4ea8-afb0-bb60d15a8b50" (UID: "0b75a3c4-3990-4ea8-afb0-bb60d15a8b50"). InnerVolumeSpecName "kube-api-access-8wwmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.405730 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-config\") pod \"87514fe1-9d83-41dc-90fb-aae028a6bf34\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.405794 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkch7\" (UniqueName: \"kubernetes.io/projected/87514fe1-9d83-41dc-90fb-aae028a6bf34-kube-api-access-nkch7\") pod \"87514fe1-9d83-41dc-90fb-aae028a6bf34\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.405834 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-dns-svc\") pod \"87514fe1-9d83-41dc-90fb-aae028a6bf34\" (UID: \"87514fe1-9d83-41dc-90fb-aae028a6bf34\") " Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.406142 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.406158 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.406168 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wwmn\" (UniqueName: \"kubernetes.io/projected/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50-kube-api-access-8wwmn\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.406605 4931 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-666b6646f7-f58nx" event={"ID":"87514fe1-9d83-41dc-90fb-aae028a6bf34","Type":"ContainerDied","Data":"1d41ecf99b02b1044f7551f2e8d246eddba16590835b1bedf402866236bfea92"} Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.406708 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-f58nx" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.406839 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87514fe1-9d83-41dc-90fb-aae028a6bf34" (UID: "87514fe1-9d83-41dc-90fb-aae028a6bf34"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.407116 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-config" (OuterVolumeSpecName: "config") pod "87514fe1-9d83-41dc-90fb-aae028a6bf34" (UID: "87514fe1-9d83-41dc-90fb-aae028a6bf34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.409765 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87514fe1-9d83-41dc-90fb-aae028a6bf34-kube-api-access-nkch7" (OuterVolumeSpecName: "kube-api-access-nkch7") pod "87514fe1-9d83-41dc-90fb-aae028a6bf34" (UID: "87514fe1-9d83-41dc-90fb-aae028a6bf34"). InnerVolumeSpecName "kube-api-access-nkch7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.413541 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" event={"ID":"0b75a3c4-3990-4ea8-afb0-bb60d15a8b50","Type":"ContainerDied","Data":"305b43b75fb1212ffe77dec5f61c484a2f013b8658d9e26b06572e6fba7c128f"} Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.413655 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-sfgtv" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.463095 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sfgtv"] Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.466001 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-sfgtv"] Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.508641 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkch7\" (UniqueName: \"kubernetes.io/projected/87514fe1-9d83-41dc-90fb-aae028a6bf34-kube-api-access-nkch7\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.508671 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.508681 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87514fe1-9d83-41dc-90fb-aae028a6bf34-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.514403 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.514449 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/openstack-galera-0" Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.547440 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-tnvwb"] Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.687263 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xhksj"] Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.781779 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.807352 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-f58nx"] Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.811962 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-f58nx"] Dec 01 15:18:52 crc kubenswrapper[4931]: I1201 15:18:52.842036 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-th96h"] Dec 01 15:18:53 crc kubenswrapper[4931]: I1201 15:18:53.437506 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xhksj" event={"ID":"0476eb16-ff0d-476b-b6da-dd437e123f26","Type":"ContainerStarted","Data":"10c6863c60ff9aa49efc166bb6b7eb92ce76fedcd45a6fad853555a0eef262f7"} Dec 01 15:18:53 crc kubenswrapper[4931]: I1201 15:18:53.438937 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xhksj" event={"ID":"0476eb16-ff0d-476b-b6da-dd437e123f26","Type":"ContainerStarted","Data":"7776feb4d0c8b3ad02dfee2e26828879e99aec2702fc8bac3a749f5decfe4690"} Dec 01 15:18:53 crc kubenswrapper[4931]: I1201 15:18:53.440168 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-th96h" event={"ID":"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e","Type":"ContainerStarted","Data":"bda28af69f40fb9218833ee2c7cef5d1c70765788ea70dc31ca7675a34615c0e"} Dec 01 15:18:53 crc kubenswrapper[4931]: I1201 
15:18:53.442839 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" event={"ID":"6957cd98-d749-411e-b8ac-bcfafbdfeb35","Type":"ContainerStarted","Data":"84407efb71b45de01fd41879dd7abf923367dc6d171608855733f039ba346657"} Dec 01 15:18:53 crc kubenswrapper[4931]: I1201 15:18:53.444371 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47","Type":"ContainerStarted","Data":"ba913387db9ed7b1f94f5e52f53079236ca7d9d96e2de36d94c078fcf9f1c609"} Dec 01 15:18:53 crc kubenswrapper[4931]: I1201 15:18:53.476927 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xhksj" podStartSLOduration=2.476546167 podStartE2EDuration="2.476546167s" podCreationTimestamp="2025-12-01 15:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:18:53.46648074 +0000 UTC m=+1079.892354427" watchObservedRunningTime="2025-12-01 15:18:53.476546167 +0000 UTC m=+1079.902419864" Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.199606 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.199983 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.256585 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b75a3c4-3990-4ea8-afb0-bb60d15a8b50" path="/var/lib/kubelet/pods/0b75a3c4-3990-4ea8-afb0-bb60d15a8b50/volumes" Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.256978 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87514fe1-9d83-41dc-90fb-aae028a6bf34" path="/var/lib/kubelet/pods/87514fe1-9d83-41dc-90fb-aae028a6bf34/volumes" Dec 01 
15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.301249 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.454086 4931 generic.go:334] "Generic (PLEG): container finished" podID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerID="6368c91757c6f705763644e048eda3d1de319803441b16a54763e4341a079a97" exitCode=0 Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.454153 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-th96h" event={"ID":"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e","Type":"ContainerDied","Data":"6368c91757c6f705763644e048eda3d1de319803441b16a54763e4341a079a97"} Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.457858 4931 generic.go:334] "Generic (PLEG): container finished" podID="6957cd98-d749-411e-b8ac-bcfafbdfeb35" containerID="3940395c07d6cfa2099bdada3fb495edd64a60525703d81388fe3cbeeec6519a" exitCode=0 Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.457921 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" event={"ID":"6957cd98-d749-411e-b8ac-bcfafbdfeb35","Type":"ContainerDied","Data":"3940395c07d6cfa2099bdada3fb495edd64a60525703d81388fe3cbeeec6519a"} Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.556943 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 01 15:18:54 crc kubenswrapper[4931]: I1201 15:18:54.905521 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 01 15:18:55 crc kubenswrapper[4931]: I1201 15:18:55.027966 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 01 15:18:55 crc kubenswrapper[4931]: I1201 15:18:55.481161 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47","Type":"ContainerStarted","Data":"4641de8d3843ffdab09f06b70b4961d17c32043276a40ee266e1b3d930bfc077"} Dec 01 15:18:55 crc kubenswrapper[4931]: I1201 15:18:55.487024 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-th96h" event={"ID":"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e","Type":"ContainerStarted","Data":"caa800ac4c84c83e9417dadc3c2b16e0455be1f8b54108ba8224c7389f988d79"} Dec 01 15:18:55 crc kubenswrapper[4931]: I1201 15:18:55.488152 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:18:55 crc kubenswrapper[4931]: I1201 15:18:55.497280 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" event={"ID":"6957cd98-d749-411e-b8ac-bcfafbdfeb35","Type":"ContainerStarted","Data":"20af9f66173ba7da9fa0b80affe1e95a4b053883fd44c3328f89ee4785cd7e66"} Dec 01 15:18:55 crc kubenswrapper[4931]: I1201 15:18:55.497429 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:18:55 crc kubenswrapper[4931]: I1201 15:18:55.505622 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-th96h" podStartSLOduration=4.037862997 podStartE2EDuration="4.505605996s" podCreationTimestamp="2025-12-01 15:18:51 +0000 UTC" firstStartedPulling="2025-12-01 15:18:52.84348795 +0000 UTC m=+1079.269361617" lastFinishedPulling="2025-12-01 15:18:53.311230929 +0000 UTC m=+1079.737104616" observedRunningTime="2025-12-01 15:18:55.503410283 +0000 UTC m=+1081.929283970" watchObservedRunningTime="2025-12-01 15:18:55.505605996 +0000 UTC m=+1081.931479663" Dec 01 15:18:55 crc kubenswrapper[4931]: I1201 15:18:55.523851 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" podStartSLOduration=3.7711705049999997 
podStartE2EDuration="4.523834396s" podCreationTimestamp="2025-12-01 15:18:51 +0000 UTC" firstStartedPulling="2025-12-01 15:18:52.557424966 +0000 UTC m=+1078.983298633" lastFinishedPulling="2025-12-01 15:18:53.310088857 +0000 UTC m=+1079.735962524" observedRunningTime="2025-12-01 15:18:55.520516341 +0000 UTC m=+1081.946390028" watchObservedRunningTime="2025-12-01 15:18:55.523834396 +0000 UTC m=+1081.949708063" Dec 01 15:18:56 crc kubenswrapper[4931]: I1201 15:18:56.433450 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 15:18:56 crc kubenswrapper[4931]: I1201 15:18:56.511612 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"474458a3-5f29-4735-bed5-96f2f1d6e352","Type":"ContainerStarted","Data":"74bdc5c8d123c21ab52d07e9632d14dc652c91a63b5149014784571e73808108"} Dec 01 15:18:56 crc kubenswrapper[4931]: I1201 15:18:56.512165 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 01 15:18:56 crc kubenswrapper[4931]: I1201 15:18:56.514012 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47","Type":"ContainerStarted","Data":"c4f57d4fc6cab2e1b35f6924eff1211d5006106c34155b3965bd60b6c98a9f39"} Dec 01 15:18:56 crc kubenswrapper[4931]: I1201 15:18:56.514524 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 01 15:18:56 crc kubenswrapper[4931]: I1201 15:18:56.540297 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.061581629 podStartE2EDuration="43.540277075s" podCreationTimestamp="2025-12-01 15:18:13 +0000 UTC" firstStartedPulling="2025-12-01 15:18:22.303178574 +0000 UTC m=+1048.729052231" lastFinishedPulling="2025-12-01 15:18:55.78187399 +0000 UTC m=+1082.207747677" observedRunningTime="2025-12-01 
15:18:56.537435964 +0000 UTC m=+1082.963309631" watchObservedRunningTime="2025-12-01 15:18:56.540277075 +0000 UTC m=+1082.966150742" Dec 01 15:18:56 crc kubenswrapper[4931]: I1201 15:18:56.565795 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.6581212990000003 podStartE2EDuration="5.565763773s" podCreationTimestamp="2025-12-01 15:18:51 +0000 UTC" firstStartedPulling="2025-12-01 15:18:52.796349584 +0000 UTC m=+1079.222223251" lastFinishedPulling="2025-12-01 15:18:54.703992058 +0000 UTC m=+1081.129865725" observedRunningTime="2025-12-01 15:18:56.565178036 +0000 UTC m=+1082.991051703" watchObservedRunningTime="2025-12-01 15:18:56.565763773 +0000 UTC m=+1082.991637440" Dec 01 15:19:02 crc kubenswrapper[4931]: I1201 15:19:02.094713 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:19:02 crc kubenswrapper[4931]: I1201 15:19:02.286551 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:19:02 crc kubenswrapper[4931]: I1201 15:19:02.345931 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-tnvwb"] Dec 01 15:19:02 crc kubenswrapper[4931]: I1201 15:19:02.563996 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" podUID="6957cd98-d749-411e-b8ac-bcfafbdfeb35" containerName="dnsmasq-dns" containerID="cri-o://20af9f66173ba7da9fa0b80affe1e95a4b053883fd44c3328f89ee4785cd7e66" gracePeriod=10 Dec 01 15:19:03 crc kubenswrapper[4931]: I1201 15:19:03.883580 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-788c-account-create-update-fg8q4"] Dec 01 15:19:03 crc kubenswrapper[4931]: I1201 15:19:03.886446 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:03 crc kubenswrapper[4931]: I1201 15:19:03.889716 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 01 15:19:03 crc kubenswrapper[4931]: I1201 15:19:03.894517 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-788c-account-create-update-fg8q4"] Dec 01 15:19:03 crc kubenswrapper[4931]: I1201 15:19:03.936898 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-65sr4"] Dec 01 15:19:03 crc kubenswrapper[4931]: I1201 15:19:03.938127 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:03 crc kubenswrapper[4931]: I1201 15:19:03.953819 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-65sr4"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.032251 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4wm6\" (UniqueName: \"kubernetes.io/projected/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-kube-api-access-g4wm6\") pod \"keystone-788c-account-create-update-fg8q4\" (UID: \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\") " pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.032839 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-operator-scripts\") pod \"keystone-788c-account-create-update-fg8q4\" (UID: \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\") " pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.134479 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4wm6\" (UniqueName: 
\"kubernetes.io/projected/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-kube-api-access-g4wm6\") pod \"keystone-788c-account-create-update-fg8q4\" (UID: \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\") " pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.134538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk26h\" (UniqueName: \"kubernetes.io/projected/bce0b7ec-9061-4965-8139-ebf2da3036be-kube-api-access-jk26h\") pod \"keystone-db-create-65sr4\" (UID: \"bce0b7ec-9061-4965-8139-ebf2da3036be\") " pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.134580 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce0b7ec-9061-4965-8139-ebf2da3036be-operator-scripts\") pod \"keystone-db-create-65sr4\" (UID: \"bce0b7ec-9061-4965-8139-ebf2da3036be\") " pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.134606 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-operator-scripts\") pod \"keystone-788c-account-create-update-fg8q4\" (UID: \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\") " pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.135312 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-operator-scripts\") pod \"keystone-788c-account-create-update-fg8q4\" (UID: \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\") " pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.135398 4931 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/placement-db-create-sr224"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.137546 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sr224" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.140543 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sr224"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.169623 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4wm6\" (UniqueName: \"kubernetes.io/projected/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-kube-api-access-g4wm6\") pod \"keystone-788c-account-create-update-fg8q4\" (UID: \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\") " pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.217873 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.237239 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x48m8\" (UniqueName: \"kubernetes.io/projected/c8725b44-71db-414e-a092-a684774ccc44-kube-api-access-x48m8\") pod \"placement-db-create-sr224\" (UID: \"c8725b44-71db-414e-a092-a684774ccc44\") " pod="openstack/placement-db-create-sr224" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.237664 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk26h\" (UniqueName: \"kubernetes.io/projected/bce0b7ec-9061-4965-8139-ebf2da3036be-kube-api-access-jk26h\") pod \"keystone-db-create-65sr4\" (UID: \"bce0b7ec-9061-4965-8139-ebf2da3036be\") " pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.237779 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/bce0b7ec-9061-4965-8139-ebf2da3036be-operator-scripts\") pod \"keystone-db-create-65sr4\" (UID: \"bce0b7ec-9061-4965-8139-ebf2da3036be\") " pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.237953 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8725b44-71db-414e-a092-a684774ccc44-operator-scripts\") pod \"placement-db-create-sr224\" (UID: \"c8725b44-71db-414e-a092-a684774ccc44\") " pod="openstack/placement-db-create-sr224" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.238548 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce0b7ec-9061-4965-8139-ebf2da3036be-operator-scripts\") pod \"keystone-db-create-65sr4\" (UID: \"bce0b7ec-9061-4965-8139-ebf2da3036be\") " pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.257740 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-86ad-account-create-update-cjxmb"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.258433 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk26h\" (UniqueName: \"kubernetes.io/projected/bce0b7ec-9061-4965-8139-ebf2da3036be-kube-api-access-jk26h\") pod \"keystone-db-create-65sr4\" (UID: \"bce0b7ec-9061-4965-8139-ebf2da3036be\") " pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.258972 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-86ad-account-create-update-cjxmb"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.259061 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.261453 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.374611 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.378752 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8725b44-71db-414e-a092-a684774ccc44-operator-scripts\") pod \"placement-db-create-sr224\" (UID: \"c8725b44-71db-414e-a092-a684774ccc44\") " pod="openstack/placement-db-create-sr224" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.378850 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x48m8\" (UniqueName: \"kubernetes.io/projected/c8725b44-71db-414e-a092-a684774ccc44-kube-api-access-x48m8\") pod \"placement-db-create-sr224\" (UID: \"c8725b44-71db-414e-a092-a684774ccc44\") " pod="openstack/placement-db-create-sr224" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.378930 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-operator-scripts\") pod \"placement-86ad-account-create-update-cjxmb\" (UID: \"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\") " pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.379037 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbz6c\" (UniqueName: \"kubernetes.io/projected/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-kube-api-access-kbz6c\") pod \"placement-86ad-account-create-update-cjxmb\" (UID: 
\"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\") " pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.379547 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8725b44-71db-414e-a092-a684774ccc44-operator-scripts\") pod \"placement-db-create-sr224\" (UID: \"c8725b44-71db-414e-a092-a684774ccc44\") " pod="openstack/placement-db-create-sr224" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.400015 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x48m8\" (UniqueName: \"kubernetes.io/projected/c8725b44-71db-414e-a092-a684774ccc44-kube-api-access-x48m8\") pod \"placement-db-create-sr224\" (UID: \"c8725b44-71db-414e-a092-a684774ccc44\") " pod="openstack/placement-db-create-sr224" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.461883 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sr224" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.480663 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-operator-scripts\") pod \"placement-86ad-account-create-update-cjxmb\" (UID: \"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\") " pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.480730 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbz6c\" (UniqueName: \"kubernetes.io/projected/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-kube-api-access-kbz6c\") pod \"placement-86ad-account-create-update-cjxmb\" (UID: \"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\") " pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.482765 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-operator-scripts\") pod \"placement-86ad-account-create-update-cjxmb\" (UID: \"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\") " pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.524261 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbz6c\" (UniqueName: \"kubernetes.io/projected/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-kube-api-access-kbz6c\") pod \"placement-86ad-account-create-update-cjxmb\" (UID: \"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\") " pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.558139 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.597287 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5gpcp"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.597962 4931 generic.go:334] "Generic (PLEG): container finished" podID="6957cd98-d749-411e-b8ac-bcfafbdfeb35" containerID="20af9f66173ba7da9fa0b80affe1e95a4b053883fd44c3328f89ee4785cd7e66" exitCode=0 Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.598312 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" event={"ID":"6957cd98-d749-411e-b8ac-bcfafbdfeb35","Type":"ContainerDied","Data":"20af9f66173ba7da9fa0b80affe1e95a4b053883fd44c3328f89ee4785cd7e66"} Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.599646 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.619603 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5gpcp"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.689510 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9czg\" (UniqueName: \"kubernetes.io/projected/792a6a53-2fbf-4393-8464-f18ead2b290d-kube-api-access-d9czg\") pod \"glance-db-create-5gpcp\" (UID: \"792a6a53-2fbf-4393-8464-f18ead2b290d\") " pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.689896 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792a6a53-2fbf-4393-8464-f18ead2b290d-operator-scripts\") pod \"glance-db-create-5gpcp\" (UID: \"792a6a53-2fbf-4393-8464-f18ead2b290d\") " pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.700713 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-70d9-account-create-update-c6rqw"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.701645 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.706793 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.720537 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-70d9-account-create-update-c6rqw"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.774010 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.793392 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-788c-account-create-update-fg8q4"] Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.802255 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9czg\" (UniqueName: \"kubernetes.io/projected/792a6a53-2fbf-4393-8464-f18ead2b290d-kube-api-access-d9czg\") pod \"glance-db-create-5gpcp\" (UID: \"792a6a53-2fbf-4393-8464-f18ead2b290d\") " pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.802346 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792a6a53-2fbf-4393-8464-f18ead2b290d-operator-scripts\") pod \"glance-db-create-5gpcp\" (UID: \"792a6a53-2fbf-4393-8464-f18ead2b290d\") " pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.803049 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792a6a53-2fbf-4393-8464-f18ead2b290d-operator-scripts\") pod \"glance-db-create-5gpcp\" (UID: \"792a6a53-2fbf-4393-8464-f18ead2b290d\") " pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.819945 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9czg\" (UniqueName: \"kubernetes.io/projected/792a6a53-2fbf-4393-8464-f18ead2b290d-kube-api-access-d9czg\") pod \"glance-db-create-5gpcp\" (UID: \"792a6a53-2fbf-4393-8464-f18ead2b290d\") " pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.904675 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b7a66dba-9924-4089-a3e5-5aa771f117b5-operator-scripts\") pod \"glance-70d9-account-create-update-c6rqw\" (UID: \"b7a66dba-9924-4089-a3e5-5aa771f117b5\") " pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.904724 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks4qp\" (UniqueName: \"kubernetes.io/projected/b7a66dba-9924-4089-a3e5-5aa771f117b5-kube-api-access-ks4qp\") pod \"glance-70d9-account-create-update-c6rqw\" (UID: \"b7a66dba-9924-4089-a3e5-5aa771f117b5\") " pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.933750 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:04 crc kubenswrapper[4931]: I1201 15:19:04.992893 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.006420 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d62g\" (UniqueName: \"kubernetes.io/projected/6957cd98-d749-411e-b8ac-bcfafbdfeb35-kube-api-access-4d62g\") pod \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.006482 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-dns-svc\") pod \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.006519 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-config\") 
pod \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.006583 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-ovsdbserver-sb\") pod \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\" (UID: \"6957cd98-d749-411e-b8ac-bcfafbdfeb35\") " Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.006772 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7a66dba-9924-4089-a3e5-5aa771f117b5-operator-scripts\") pod \"glance-70d9-account-create-update-c6rqw\" (UID: \"b7a66dba-9924-4089-a3e5-5aa771f117b5\") " pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.006805 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks4qp\" (UniqueName: \"kubernetes.io/projected/b7a66dba-9924-4089-a3e5-5aa771f117b5-kube-api-access-ks4qp\") pod \"glance-70d9-account-create-update-c6rqw\" (UID: \"b7a66dba-9924-4089-a3e5-5aa771f117b5\") " pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.012720 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7a66dba-9924-4089-a3e5-5aa771f117b5-operator-scripts\") pod \"glance-70d9-account-create-update-c6rqw\" (UID: \"b7a66dba-9924-4089-a3e5-5aa771f117b5\") " pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.013156 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6957cd98-d749-411e-b8ac-bcfafbdfeb35-kube-api-access-4d62g" (OuterVolumeSpecName: "kube-api-access-4d62g") pod 
"6957cd98-d749-411e-b8ac-bcfafbdfeb35" (UID: "6957cd98-d749-411e-b8ac-bcfafbdfeb35"). InnerVolumeSpecName "kube-api-access-4d62g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.032350 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks4qp\" (UniqueName: \"kubernetes.io/projected/b7a66dba-9924-4089-a3e5-5aa771f117b5-kube-api-access-ks4qp\") pod \"glance-70d9-account-create-update-c6rqw\" (UID: \"b7a66dba-9924-4089-a3e5-5aa771f117b5\") " pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.046524 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.048109 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6957cd98-d749-411e-b8ac-bcfafbdfeb35" (UID: "6957cd98-d749-411e-b8ac-bcfafbdfeb35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.052426 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-config" (OuterVolumeSpecName: "config") pod "6957cd98-d749-411e-b8ac-bcfafbdfeb35" (UID: "6957cd98-d749-411e-b8ac-bcfafbdfeb35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.076892 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6957cd98-d749-411e-b8ac-bcfafbdfeb35" (UID: "6957cd98-d749-411e-b8ac-bcfafbdfeb35"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.107615 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.107651 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.107661 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6957cd98-d749-411e-b8ac-bcfafbdfeb35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.107671 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d62g\" (UniqueName: \"kubernetes.io/projected/6957cd98-d749-411e-b8ac-bcfafbdfeb35-kube-api-access-4d62g\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.156918 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-sr224"] Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.279484 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-65sr4"] Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.341743 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-86ad-account-create-update-cjxmb"] Dec 01 15:19:05 crc kubenswrapper[4931]: W1201 15:19:05.352803 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9447449e_ba97_4ec6_b8c6_bdc2b54aa746.slice/crio-4a78c320e2cdc2e21f186f8aea8b8265efe084183e1b9d81511835eee8f068a4 WatchSource:0}: Error finding container 
4a78c320e2cdc2e21f186f8aea8b8265efe084183e1b9d81511835eee8f068a4: Status 404 returned error can't find the container with id 4a78c320e2cdc2e21f186f8aea8b8265efe084183e1b9d81511835eee8f068a4 Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.406808 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5gpcp"] Dec 01 15:19:05 crc kubenswrapper[4931]: W1201 15:19:05.413300 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod792a6a53_2fbf_4393_8464_f18ead2b290d.slice/crio-ff1ae86158e564c1aa995c3545f050ca28634a8314ed16ce7c9a3af42f6ae02d WatchSource:0}: Error finding container ff1ae86158e564c1aa995c3545f050ca28634a8314ed16ce7c9a3af42f6ae02d: Status 404 returned error can't find the container with id ff1ae86158e564c1aa995c3545f050ca28634a8314ed16ce7c9a3af42f6ae02d Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.544006 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-70d9-account-create-update-c6rqw"] Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.610795 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sr224" event={"ID":"c8725b44-71db-414e-a092-a684774ccc44","Type":"ContainerStarted","Data":"c86bb0a9442da898b51e6e080f2f1403ab11dacd0195a5eeca7b2571ce0d5115"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.610843 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sr224" event={"ID":"c8725b44-71db-414e-a092-a684774ccc44","Type":"ContainerStarted","Data":"f53b90a709afe4750e0e806515ef759199c33a7860f992fa04c589cd4e86c732"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.612935 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5gpcp" event={"ID":"792a6a53-2fbf-4393-8464-f18ead2b290d","Type":"ContainerStarted","Data":"4b206d60ad533adca31c307ccb4d0539ee362c2423e697e0fc52abbdabe2aa20"} 
Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.612981 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5gpcp" event={"ID":"792a6a53-2fbf-4393-8464-f18ead2b290d","Type":"ContainerStarted","Data":"ff1ae86158e564c1aa995c3545f050ca28634a8314ed16ce7c9a3af42f6ae02d"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.614859 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86ad-account-create-update-cjxmb" event={"ID":"9447449e-ba97-4ec6-b8c6-bdc2b54aa746","Type":"ContainerStarted","Data":"d1fa95d1ef0a54347f08ae3286bdbe0c8a631f94b0bd1c090c96edce11e8a2ce"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.614896 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86ad-account-create-update-cjxmb" event={"ID":"9447449e-ba97-4ec6-b8c6-bdc2b54aa746","Type":"ContainerStarted","Data":"4a78c320e2cdc2e21f186f8aea8b8265efe084183e1b9d81511835eee8f068a4"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.619242 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" event={"ID":"6957cd98-d749-411e-b8ac-bcfafbdfeb35","Type":"ContainerDied","Data":"84407efb71b45de01fd41879dd7abf923367dc6d171608855733f039ba346657"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.619264 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-tnvwb" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.619306 4931 scope.go:117] "RemoveContainer" containerID="20af9f66173ba7da9fa0b80affe1e95a4b053883fd44c3328f89ee4785cd7e66" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.620850 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-70d9-account-create-update-c6rqw" event={"ID":"b7a66dba-9924-4089-a3e5-5aa771f117b5","Type":"ContainerStarted","Data":"1b37891e5e7a3806646fee26af1a1522d4dff06c8852f3edbb5386a7550b2fb3"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.624338 4931 generic.go:334] "Generic (PLEG): container finished" podID="27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c" containerID="3d23c379a6890da2744f5195f6128f44d756b055e73126e3e30e8dd0f1745fdb" exitCode=0 Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.624418 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-788c-account-create-update-fg8q4" event={"ID":"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c","Type":"ContainerDied","Data":"3d23c379a6890da2744f5195f6128f44d756b055e73126e3e30e8dd0f1745fdb"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.624445 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-788c-account-create-update-fg8q4" event={"ID":"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c","Type":"ContainerStarted","Data":"eec71bf295823d6c69a1698d0236d3d1866c70bfafaa3c026e6f6428e2477d29"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.634163 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-sr224" podStartSLOduration=1.634145172 podStartE2EDuration="1.634145172s" podCreationTimestamp="2025-12-01 15:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:05.627011728 +0000 UTC m=+1092.052885395" 
watchObservedRunningTime="2025-12-01 15:19:05.634145172 +0000 UTC m=+1092.060018839" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.636447 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-65sr4" event={"ID":"bce0b7ec-9061-4965-8139-ebf2da3036be","Type":"ContainerStarted","Data":"fdd7a4c7a92ba9cb758e88893b2f1c5f62dba4a3c7e9298106be6262e54a308e"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.636500 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-65sr4" event={"ID":"bce0b7ec-9061-4965-8139-ebf2da3036be","Type":"ContainerStarted","Data":"9324a30c2de23a442b299afb91b1d04b38e24a0c3bd86dc8510cc921b5f4a7d7"} Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.642829 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-86ad-account-create-update-cjxmb" podStartSLOduration=1.642814419 podStartE2EDuration="1.642814419s" podCreationTimestamp="2025-12-01 15:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:05.642279494 +0000 UTC m=+1092.068153161" watchObservedRunningTime="2025-12-01 15:19:05.642814419 +0000 UTC m=+1092.068688086" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.654539 4931 scope.go:117] "RemoveContainer" containerID="3940395c07d6cfa2099bdada3fb495edd64a60525703d81388fe3cbeeec6519a" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.681898 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-5gpcp" podStartSLOduration=1.681882154 podStartE2EDuration="1.681882154s" podCreationTimestamp="2025-12-01 15:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:05.678375014 +0000 UTC m=+1092.104248681" watchObservedRunningTime="2025-12-01 
15:19:05.681882154 +0000 UTC m=+1092.107755821" Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.702846 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-tnvwb"] Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.710836 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-tnvwb"] Dec 01 15:19:05 crc kubenswrapper[4931]: I1201 15:19:05.712236 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-65sr4" podStartSLOduration=2.7122162100000002 podStartE2EDuration="2.71221621s" podCreationTimestamp="2025-12-01 15:19:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:05.700893367 +0000 UTC m=+1092.126767034" watchObservedRunningTime="2025-12-01 15:19:05.71221621 +0000 UTC m=+1092.138089877" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.251738 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6957cd98-d749-411e-b8ac-bcfafbdfeb35" path="/var/lib/kubelet/pods/6957cd98-d749-411e-b8ac-bcfafbdfeb35/volumes" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.417541 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zvwkt"] Dec 01 15:19:06 crc kubenswrapper[4931]: E1201 15:19:06.421111 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6957cd98-d749-411e-b8ac-bcfafbdfeb35" containerName="init" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.421145 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6957cd98-d749-411e-b8ac-bcfafbdfeb35" containerName="init" Dec 01 15:19:06 crc kubenswrapper[4931]: E1201 15:19:06.421177 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6957cd98-d749-411e-b8ac-bcfafbdfeb35" containerName="dnsmasq-dns" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.421185 
4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6957cd98-d749-411e-b8ac-bcfafbdfeb35" containerName="dnsmasq-dns" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.421420 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6957cd98-d749-411e-b8ac-bcfafbdfeb35" containerName="dnsmasq-dns" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.422452 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.442458 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zvwkt"] Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.529202 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.529282 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wt5z\" (UniqueName: \"kubernetes.io/projected/f61f6fb9-c1c7-4feb-a851-711b9f093142-kube-api-access-7wt5z\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.529313 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.529335 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-config\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.529441 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.630767 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.630856 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wt5z\" (UniqueName: \"kubernetes.io/projected/f61f6fb9-c1c7-4feb-a851-711b9f093142-kube-api-access-7wt5z\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.630894 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.630921 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-config\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.631051 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.632063 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.633810 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-config\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.636221 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.636952 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-nb\") pod 
\"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.647246 4931 generic.go:334] "Generic (PLEG): container finished" podID="9447449e-ba97-4ec6-b8c6-bdc2b54aa746" containerID="d1fa95d1ef0a54347f08ae3286bdbe0c8a631f94b0bd1c090c96edce11e8a2ce" exitCode=0 Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.647311 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86ad-account-create-update-cjxmb" event={"ID":"9447449e-ba97-4ec6-b8c6-bdc2b54aa746","Type":"ContainerDied","Data":"d1fa95d1ef0a54347f08ae3286bdbe0c8a631f94b0bd1c090c96edce11e8a2ce"} Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.650330 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wt5z\" (UniqueName: \"kubernetes.io/projected/f61f6fb9-c1c7-4feb-a851-711b9f093142-kube-api-access-7wt5z\") pod \"dnsmasq-dns-b8fbc5445-zvwkt\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.697033 4931 generic.go:334] "Generic (PLEG): container finished" podID="b7a66dba-9924-4089-a3e5-5aa771f117b5" containerID="f0325033cc8c49f4325bd511aff1e26e25ca14fb3bd79e8105d0bb277ffbc81e" exitCode=0 Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.697088 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-70d9-account-create-update-c6rqw" event={"ID":"b7a66dba-9924-4089-a3e5-5aa771f117b5","Type":"ContainerDied","Data":"f0325033cc8c49f4325bd511aff1e26e25ca14fb3bd79e8105d0bb277ffbc81e"} Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.699066 4931 generic.go:334] "Generic (PLEG): container finished" podID="c8725b44-71db-414e-a092-a684774ccc44" containerID="c86bb0a9442da898b51e6e080f2f1403ab11dacd0195a5eeca7b2571ce0d5115" exitCode=0 Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.699148 
4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sr224" event={"ID":"c8725b44-71db-414e-a092-a684774ccc44","Type":"ContainerDied","Data":"c86bb0a9442da898b51e6e080f2f1403ab11dacd0195a5eeca7b2571ce0d5115"} Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.700019 4931 generic.go:334] "Generic (PLEG): container finished" podID="bce0b7ec-9061-4965-8139-ebf2da3036be" containerID="fdd7a4c7a92ba9cb758e88893b2f1c5f62dba4a3c7e9298106be6262e54a308e" exitCode=0 Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.700066 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-65sr4" event={"ID":"bce0b7ec-9061-4965-8139-ebf2da3036be","Type":"ContainerDied","Data":"fdd7a4c7a92ba9cb758e88893b2f1c5f62dba4a3c7e9298106be6262e54a308e"} Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.700968 4931 generic.go:334] "Generic (PLEG): container finished" podID="792a6a53-2fbf-4393-8464-f18ead2b290d" containerID="4b206d60ad533adca31c307ccb4d0539ee362c2423e697e0fc52abbdabe2aa20" exitCode=0 Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.701105 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5gpcp" event={"ID":"792a6a53-2fbf-4393-8464-f18ead2b290d","Type":"ContainerDied","Data":"4b206d60ad533adca31c307ccb4d0539ee362c2423e697e0fc52abbdabe2aa20"} Dec 01 15:19:06 crc kubenswrapper[4931]: I1201 15:19:06.744552 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.007991 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.139698 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-operator-scripts\") pod \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\" (UID: \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\") " Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.139915 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4wm6\" (UniqueName: \"kubernetes.io/projected/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-kube-api-access-g4wm6\") pod \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\" (UID: \"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c\") " Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.140600 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c" (UID: "27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.144677 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-kube-api-access-g4wm6" (OuterVolumeSpecName: "kube-api-access-g4wm6") pod "27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c" (UID: "27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c"). InnerVolumeSpecName "kube-api-access-g4wm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.192273 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zvwkt"] Dec 01 15:19:07 crc kubenswrapper[4931]: W1201 15:19:07.198161 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf61f6fb9_c1c7_4feb_a851_711b9f093142.slice/crio-2a997ffe72072ef8c22243d727273db31d3280d49d6e1772718974d71002c1ce WatchSource:0}: Error finding container 2a997ffe72072ef8c22243d727273db31d3280d49d6e1772718974d71002c1ce: Status 404 returned error can't find the container with id 2a997ffe72072ef8c22243d727273db31d3280d49d6e1772718974d71002c1ce Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.241843 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4wm6\" (UniqueName: \"kubernetes.io/projected/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-kube-api-access-g4wm6\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.242041 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.267598 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.481416 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 01 15:19:07 crc kubenswrapper[4931]: E1201 15:19:07.481719 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c" containerName="mariadb-account-create-update" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.481731 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c" containerName="mariadb-account-create-update" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.481902 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c" containerName="mariadb-account-create-update" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.486529 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.488670 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.488689 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-6kg25" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.492765 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.493398 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.501318 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.654032 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.654563 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np24t\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-kube-api-access-np24t\") pod \"swift-storage-0\" (UID: 
\"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.654589 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fe036b57-6753-42af-ad39-195f0688532d-cache\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.654667 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.654686 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fe036b57-6753-42af-ad39-195f0688532d-lock\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.714150 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-788c-account-create-update-fg8q4" event={"ID":"27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c","Type":"ContainerDied","Data":"eec71bf295823d6c69a1698d0236d3d1866c70bfafaa3c026e6f6428e2477d29"} Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.714187 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eec71bf295823d6c69a1698d0236d3d1866c70bfafaa3c026e6f6428e2477d29" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.714202 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-788c-account-create-update-fg8q4" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.715765 4931 generic.go:334] "Generic (PLEG): container finished" podID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerID="58d78ec23db06d08e250939e3d9d32c542a0768487415f900d795c9ac71e1813" exitCode=0 Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.715790 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" event={"ID":"f61f6fb9-c1c7-4feb-a851-711b9f093142","Type":"ContainerDied","Data":"58d78ec23db06d08e250939e3d9d32c542a0768487415f900d795c9ac71e1813"} Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.715811 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" event={"ID":"f61f6fb9-c1c7-4feb-a851-711b9f093142","Type":"ContainerStarted","Data":"2a997ffe72072ef8c22243d727273db31d3280d49d6e1772718974d71002c1ce"} Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.756082 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.756132 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fe036b57-6753-42af-ad39-195f0688532d-lock\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.756226 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" 
Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.756286 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np24t\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-kube-api-access-np24t\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.756314 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fe036b57-6753-42af-ad39-195f0688532d-cache\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.756796 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fe036b57-6753-42af-ad39-195f0688532d-cache\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.756891 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.757117 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/fe036b57-6753-42af-ad39-195f0688532d-lock\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: E1201 15:19:07.757218 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 15:19:07 crc kubenswrapper[4931]: E1201 
15:19:07.757233 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 15:19:07 crc kubenswrapper[4931]: E1201 15:19:07.757268 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift podName:fe036b57-6753-42af-ad39-195f0688532d nodeName:}" failed. No retries permitted until 2025-12-01 15:19:08.257253986 +0000 UTC m=+1094.683127653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift") pod "swift-storage-0" (UID: "fe036b57-6753-42af-ad39-195f0688532d") : configmap "swift-ring-files" not found Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.781564 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np24t\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-kube-api-access-np24t\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:07 crc kubenswrapper[4931]: I1201 15:19:07.784577 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.092983 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.265638 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce0b7ec-9061-4965-8139-ebf2da3036be-operator-scripts\") pod \"bce0b7ec-9061-4965-8139-ebf2da3036be\" (UID: \"bce0b7ec-9061-4965-8139-ebf2da3036be\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.265775 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk26h\" (UniqueName: \"kubernetes.io/projected/bce0b7ec-9061-4965-8139-ebf2da3036be-kube-api-access-jk26h\") pod \"bce0b7ec-9061-4965-8139-ebf2da3036be\" (UID: \"bce0b7ec-9061-4965-8139-ebf2da3036be\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.266046 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce0b7ec-9061-4965-8139-ebf2da3036be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bce0b7ec-9061-4965-8139-ebf2da3036be" (UID: "bce0b7ec-9061-4965-8139-ebf2da3036be"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.266214 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:08 crc kubenswrapper[4931]: E1201 15:19:08.266362 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 15:19:08 crc kubenswrapper[4931]: E1201 15:19:08.266374 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 15:19:08 crc kubenswrapper[4931]: E1201 15:19:08.266445 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift podName:fe036b57-6753-42af-ad39-195f0688532d nodeName:}" failed. No retries permitted until 2025-12-01 15:19:09.266432268 +0000 UTC m=+1095.692305935 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift") pod "swift-storage-0" (UID: "fe036b57-6753-42af-ad39-195f0688532d") : configmap "swift-ring-files" not found Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.266961 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce0b7ec-9061-4965-8139-ebf2da3036be-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.271289 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce0b7ec-9061-4965-8139-ebf2da3036be-kube-api-access-jk26h" (OuterVolumeSpecName: "kube-api-access-jk26h") pod "bce0b7ec-9061-4965-8139-ebf2da3036be" (UID: "bce0b7ec-9061-4965-8139-ebf2da3036be"). InnerVolumeSpecName "kube-api-access-jk26h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.339833 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-sr224" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.348843 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.360179 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.377586 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk26h\" (UniqueName: \"kubernetes.io/projected/bce0b7ec-9061-4965-8139-ebf2da3036be-kube-api-access-jk26h\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.382927 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.478229 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9czg\" (UniqueName: \"kubernetes.io/projected/792a6a53-2fbf-4393-8464-f18ead2b290d-kube-api-access-d9czg\") pod \"792a6a53-2fbf-4393-8464-f18ead2b290d\" (UID: \"792a6a53-2fbf-4393-8464-f18ead2b290d\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.478631 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbz6c\" (UniqueName: \"kubernetes.io/projected/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-kube-api-access-kbz6c\") pod \"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\" (UID: \"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.478657 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-operator-scripts\") pod \"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\" (UID: \"9447449e-ba97-4ec6-b8c6-bdc2b54aa746\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.478825 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8725b44-71db-414e-a092-a684774ccc44-operator-scripts\") pod \"c8725b44-71db-414e-a092-a684774ccc44\" (UID: \"c8725b44-71db-414e-a092-a684774ccc44\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.478862 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792a6a53-2fbf-4393-8464-f18ead2b290d-operator-scripts\") pod \"792a6a53-2fbf-4393-8464-f18ead2b290d\" (UID: \"792a6a53-2fbf-4393-8464-f18ead2b290d\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.478888 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x48m8\" (UniqueName: \"kubernetes.io/projected/c8725b44-71db-414e-a092-a684774ccc44-kube-api-access-x48m8\") pod \"c8725b44-71db-414e-a092-a684774ccc44\" (UID: \"c8725b44-71db-414e-a092-a684774ccc44\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.479307 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9447449e-ba97-4ec6-b8c6-bdc2b54aa746" (UID: "9447449e-ba97-4ec6-b8c6-bdc2b54aa746"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.479336 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/792a6a53-2fbf-4393-8464-f18ead2b290d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "792a6a53-2fbf-4393-8464-f18ead2b290d" (UID: "792a6a53-2fbf-4393-8464-f18ead2b290d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.479396 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8725b44-71db-414e-a092-a684774ccc44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8725b44-71db-414e-a092-a684774ccc44" (UID: "c8725b44-71db-414e-a092-a684774ccc44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.482515 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-kube-api-access-kbz6c" (OuterVolumeSpecName: "kube-api-access-kbz6c") pod "9447449e-ba97-4ec6-b8c6-bdc2b54aa746" (UID: "9447449e-ba97-4ec6-b8c6-bdc2b54aa746"). 
InnerVolumeSpecName "kube-api-access-kbz6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.482813 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792a6a53-2fbf-4393-8464-f18ead2b290d-kube-api-access-d9czg" (OuterVolumeSpecName: "kube-api-access-d9czg") pod "792a6a53-2fbf-4393-8464-f18ead2b290d" (UID: "792a6a53-2fbf-4393-8464-f18ead2b290d"). InnerVolumeSpecName "kube-api-access-d9czg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.488111 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8725b44-71db-414e-a092-a684774ccc44-kube-api-access-x48m8" (OuterVolumeSpecName: "kube-api-access-x48m8") pod "c8725b44-71db-414e-a092-a684774ccc44" (UID: "c8725b44-71db-414e-a092-a684774ccc44"). InnerVolumeSpecName "kube-api-access-x48m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.580676 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7a66dba-9924-4089-a3e5-5aa771f117b5-operator-scripts\") pod \"b7a66dba-9924-4089-a3e5-5aa771f117b5\" (UID: \"b7a66dba-9924-4089-a3e5-5aa771f117b5\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.580775 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks4qp\" (UniqueName: \"kubernetes.io/projected/b7a66dba-9924-4089-a3e5-5aa771f117b5-kube-api-access-ks4qp\") pod \"b7a66dba-9924-4089-a3e5-5aa771f117b5\" (UID: \"b7a66dba-9924-4089-a3e5-5aa771f117b5\") " Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.581298 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbz6c\" (UniqueName: 
\"kubernetes.io/projected/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-kube-api-access-kbz6c\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.581319 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9447449e-ba97-4ec6-b8c6-bdc2b54aa746-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.581329 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8725b44-71db-414e-a092-a684774ccc44-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.581340 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/792a6a53-2fbf-4393-8464-f18ead2b290d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.581378 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x48m8\" (UniqueName: \"kubernetes.io/projected/c8725b44-71db-414e-a092-a684774ccc44-kube-api-access-x48m8\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.581415 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9czg\" (UniqueName: \"kubernetes.io/projected/792a6a53-2fbf-4393-8464-f18ead2b290d-kube-api-access-d9czg\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.581734 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7a66dba-9924-4089-a3e5-5aa771f117b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7a66dba-9924-4089-a3e5-5aa771f117b5" (UID: "b7a66dba-9924-4089-a3e5-5aa771f117b5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.584788 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a66dba-9924-4089-a3e5-5aa771f117b5-kube-api-access-ks4qp" (OuterVolumeSpecName: "kube-api-access-ks4qp") pod "b7a66dba-9924-4089-a3e5-5aa771f117b5" (UID: "b7a66dba-9924-4089-a3e5-5aa771f117b5"). InnerVolumeSpecName "kube-api-access-ks4qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.683181 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7a66dba-9924-4089-a3e5-5aa771f117b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.683220 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks4qp\" (UniqueName: \"kubernetes.io/projected/b7a66dba-9924-4089-a3e5-5aa771f117b5-kube-api-access-ks4qp\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.725569 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5gpcp" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.725587 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5gpcp" event={"ID":"792a6a53-2fbf-4393-8464-f18ead2b290d","Type":"ContainerDied","Data":"ff1ae86158e564c1aa995c3545f050ca28634a8314ed16ce7c9a3af42f6ae02d"} Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.725617 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff1ae86158e564c1aa995c3545f050ca28634a8314ed16ce7c9a3af42f6ae02d" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.726952 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86ad-account-create-update-cjxmb" event={"ID":"9447449e-ba97-4ec6-b8c6-bdc2b54aa746","Type":"ContainerDied","Data":"4a78c320e2cdc2e21f186f8aea8b8265efe084183e1b9d81511835eee8f068a4"} Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.727006 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-86ad-account-create-update-cjxmb" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.727024 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a78c320e2cdc2e21f186f8aea8b8265efe084183e1b9d81511835eee8f068a4" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.728830 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" event={"ID":"f61f6fb9-c1c7-4feb-a851-711b9f093142","Type":"ContainerStarted","Data":"8e8e3d00dd4510d8c026119eb82ad7d32245c7b05317be8aad83da5e86e754ef"} Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.729205 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.730114 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-70d9-account-create-update-c6rqw" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.730102 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-70d9-account-create-update-c6rqw" event={"ID":"b7a66dba-9924-4089-a3e5-5aa771f117b5","Type":"ContainerDied","Data":"1b37891e5e7a3806646fee26af1a1522d4dff06c8852f3edbb5386a7550b2fb3"} Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.730274 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b37891e5e7a3806646fee26af1a1522d4dff06c8852f3edbb5386a7550b2fb3" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.731683 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-65sr4" event={"ID":"bce0b7ec-9061-4965-8139-ebf2da3036be","Type":"ContainerDied","Data":"9324a30c2de23a442b299afb91b1d04b38e24a0c3bd86dc8510cc921b5f4a7d7"} Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.731704 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9324a30c2de23a442b299afb91b1d04b38e24a0c3bd86dc8510cc921b5f4a7d7" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.731771 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-65sr4" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.733171 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-sr224" event={"ID":"c8725b44-71db-414e-a092-a684774ccc44","Type":"ContainerDied","Data":"f53b90a709afe4750e0e806515ef759199c33a7860f992fa04c589cd4e86c732"} Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.733190 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f53b90a709afe4750e0e806515ef759199c33a7860f992fa04c589cd4e86c732" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.733347 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-sr224" Dec 01 15:19:08 crc kubenswrapper[4931]: I1201 15:19:08.764862 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" podStartSLOduration=2.764842242 podStartE2EDuration="2.764842242s" podCreationTimestamp="2025-12-01 15:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:08.750895134 +0000 UTC m=+1095.176768801" watchObservedRunningTime="2025-12-01 15:19:08.764842242 +0000 UTC m=+1095.190715919" Dec 01 15:19:09 crc kubenswrapper[4931]: E1201 15:19:09.296247 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 15:19:09 crc kubenswrapper[4931]: E1201 15:19:09.296651 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 15:19:09 crc kubenswrapper[4931]: E1201 15:19:09.296706 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift podName:fe036b57-6753-42af-ad39-195f0688532d nodeName:}" failed. No retries permitted until 2025-12-01 15:19:11.29668845 +0000 UTC m=+1097.722562117 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift") pod "swift-storage-0" (UID: "fe036b57-6753-42af-ad39-195f0688532d") : configmap "swift-ring-files" not found Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.296558 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.849813 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xnxrs"] Dec 01 15:19:09 crc kubenswrapper[4931]: E1201 15:19:09.850232 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8725b44-71db-414e-a092-a684774ccc44" containerName="mariadb-database-create" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850250 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8725b44-71db-414e-a092-a684774ccc44" containerName="mariadb-database-create" Dec 01 15:19:09 crc kubenswrapper[4931]: E1201 15:19:09.850263 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a66dba-9924-4089-a3e5-5aa771f117b5" containerName="mariadb-account-create-update" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850271 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a66dba-9924-4089-a3e5-5aa771f117b5" containerName="mariadb-account-create-update" Dec 01 15:19:09 crc kubenswrapper[4931]: E1201 15:19:09.850286 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce0b7ec-9061-4965-8139-ebf2da3036be" containerName="mariadb-database-create" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850294 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce0b7ec-9061-4965-8139-ebf2da3036be" 
containerName="mariadb-database-create" Dec 01 15:19:09 crc kubenswrapper[4931]: E1201 15:19:09.850302 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792a6a53-2fbf-4393-8464-f18ead2b290d" containerName="mariadb-database-create" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850307 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="792a6a53-2fbf-4393-8464-f18ead2b290d" containerName="mariadb-database-create" Dec 01 15:19:09 crc kubenswrapper[4931]: E1201 15:19:09.850341 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9447449e-ba97-4ec6-b8c6-bdc2b54aa746" containerName="mariadb-account-create-update" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850351 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9447449e-ba97-4ec6-b8c6-bdc2b54aa746" containerName="mariadb-account-create-update" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850518 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a66dba-9924-4089-a3e5-5aa771f117b5" containerName="mariadb-account-create-update" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850536 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9447449e-ba97-4ec6-b8c6-bdc2b54aa746" containerName="mariadb-account-create-update" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850544 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="792a6a53-2fbf-4393-8464-f18ead2b290d" containerName="mariadb-database-create" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850558 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8725b44-71db-414e-a092-a684774ccc44" containerName="mariadb-database-create" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.850566 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce0b7ec-9061-4965-8139-ebf2da3036be" containerName="mariadb-database-create" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 
15:19:09.851104 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.854147 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.854250 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6wcpc" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.864207 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xnxrs"] Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.907244 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srrp8\" (UniqueName: \"kubernetes.io/projected/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-kube-api-access-srrp8\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.907334 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-config-data\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.907360 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-combined-ca-bundle\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:09 crc kubenswrapper[4931]: I1201 15:19:09.907453 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-db-sync-config-data\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.008124 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-db-sync-config-data\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.008219 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srrp8\" (UniqueName: \"kubernetes.io/projected/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-kube-api-access-srrp8\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.008277 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-config-data\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.008301 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-combined-ca-bundle\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.019210 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-db-sync-config-data\") pod 
\"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.019899 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-combined-ca-bundle\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.020171 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-config-data\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.023826 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srrp8\" (UniqueName: \"kubernetes.io/projected/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-kube-api-access-srrp8\") pod \"glance-db-sync-xnxrs\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.169844 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:10 crc kubenswrapper[4931]: I1201 15:19:10.747272 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xnxrs"] Dec 01 15:19:10 crc kubenswrapper[4931]: W1201 15:19:10.752415 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf23b7a9b_3694_415b_8e68_d0757a5ccc7a.slice/crio-3622988efc42f0d8cc146cd05fbcfc951d48dfec73167b903a424808e31218b9 WatchSource:0}: Error finding container 3622988efc42f0d8cc146cd05fbcfc951d48dfec73167b903a424808e31218b9: Status 404 returned error can't find the container with id 3622988efc42f0d8cc146cd05fbcfc951d48dfec73167b903a424808e31218b9 Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.336158 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:11 crc kubenswrapper[4931]: E1201 15:19:11.336319 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 15:19:11 crc kubenswrapper[4931]: E1201 15:19:11.336340 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 15:19:11 crc kubenswrapper[4931]: E1201 15:19:11.336414 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift podName:fe036b57-6753-42af-ad39-195f0688532d nodeName:}" failed. No retries permitted until 2025-12-01 15:19:15.336377943 +0000 UTC m=+1101.762251610 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift") pod "swift-storage-0" (UID: "fe036b57-6753-42af-ad39-195f0688532d") : configmap "swift-ring-files" not found Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.494338 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-lbvg4"] Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.496643 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.501618 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.502032 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.502044 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.523293 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lbvg4"] Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.538121 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-swiftconf\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.538212 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-combined-ca-bundle\") pod \"swift-ring-rebalance-lbvg4\" (UID: 
\"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.538291 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-scripts\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.538346 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-etc-swift\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.538374 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbwzs\" (UniqueName: \"kubernetes.io/projected/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-kube-api-access-tbwzs\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.538460 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-ring-data-devices\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.538550 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-dispersionconf\") pod 
\"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.640203 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-scripts\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.640618 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-etc-swift\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.640650 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbwzs\" (UniqueName: \"kubernetes.io/projected/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-kube-api-access-tbwzs\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.640688 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-ring-data-devices\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.640733 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-dispersionconf\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " 
pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.640770 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-swiftconf\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.640792 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-combined-ca-bundle\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.642456 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-etc-swift\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.642626 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-ring-data-devices\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.642686 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-scripts\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.655112 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-swiftconf\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.661270 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-combined-ca-bundle\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.661583 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-dispersionconf\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.670050 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbwzs\" (UniqueName: \"kubernetes.io/projected/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-kube-api-access-tbwzs\") pod \"swift-ring-rebalance-lbvg4\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.756179 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnxrs" event={"ID":"f23b7a9b-3694-415b-8e68-d0757a5ccc7a","Type":"ContainerStarted","Data":"3622988efc42f0d8cc146cd05fbcfc951d48dfec73167b903a424808e31218b9"} Dec 01 15:19:11 crc kubenswrapper[4931]: I1201 15:19:11.817295 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:12 crc kubenswrapper[4931]: W1201 15:19:12.238125 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec217c21_8698_4dd3_a58d_b2626cbfbaf2.slice/crio-b6e88870ba4cbe8547807308025faf9ed80eb64e92cd44a8a7c0d1d503641103 WatchSource:0}: Error finding container b6e88870ba4cbe8547807308025faf9ed80eb64e92cd44a8a7c0d1d503641103: Status 404 returned error can't find the container with id b6e88870ba4cbe8547807308025faf9ed80eb64e92cd44a8a7c0d1d503641103 Dec 01 15:19:12 crc kubenswrapper[4931]: I1201 15:19:12.238575 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lbvg4"] Dec 01 15:19:12 crc kubenswrapper[4931]: I1201 15:19:12.764897 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lbvg4" event={"ID":"ec217c21-8698-4dd3-a58d-b2626cbfbaf2","Type":"ContainerStarted","Data":"b6e88870ba4cbe8547807308025faf9ed80eb64e92cd44a8a7c0d1d503641103"} Dec 01 15:19:13 crc kubenswrapper[4931]: I1201 15:19:13.774942 4931 generic.go:334] "Generic (PLEG): container finished" podID="6f8b18d2-d611-4ad6-850a-4ad19544c016" containerID="c4f945b9f807d6cde63f94d15690bb2e2f0ad176652953dc94b9fb21983d50d3" exitCode=0 Dec 01 15:19:13 crc kubenswrapper[4931]: I1201 15:19:13.775140 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f8b18d2-d611-4ad6-850a-4ad19544c016","Type":"ContainerDied","Data":"c4f945b9f807d6cde63f94d15690bb2e2f0ad176652953dc94b9fb21983d50d3"} Dec 01 15:19:14 crc kubenswrapper[4931]: I1201 15:19:14.783780 4931 generic.go:334] "Generic (PLEG): container finished" podID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" containerID="44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff" exitCode=0 Dec 01 15:19:14 crc kubenswrapper[4931]: I1201 15:19:14.783839 4931 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a675ebc0-8c3b-4c43-884f-b32bd954ac6e","Type":"ContainerDied","Data":"44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff"} Dec 01 15:19:15 crc kubenswrapper[4931]: I1201 15:19:15.430410 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:15 crc kubenswrapper[4931]: E1201 15:19:15.430644 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 15:19:15 crc kubenswrapper[4931]: E1201 15:19:15.430870 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 15:19:15 crc kubenswrapper[4931]: E1201 15:19:15.430929 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift podName:fe036b57-6753-42af-ad39-195f0688532d nodeName:}" failed. No retries permitted until 2025-12-01 15:19:23.43091312 +0000 UTC m=+1109.856786787 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift") pod "swift-storage-0" (UID: "fe036b57-6753-42af-ad39-195f0688532d") : configmap "swift-ring-files" not found Dec 01 15:19:16 crc kubenswrapper[4931]: I1201 15:19:16.754652 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:19:16 crc kubenswrapper[4931]: I1201 15:19:16.810963 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-th96h"] Dec 01 15:19:16 crc kubenswrapper[4931]: I1201 15:19:16.811246 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-th96h" podUID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerName="dnsmasq-dns" containerID="cri-o://caa800ac4c84c83e9417dadc3c2b16e0455be1f8b54108ba8224c7389f988d79" gracePeriod=10 Dec 01 15:19:17 crc kubenswrapper[4931]: I1201 15:19:17.829318 4931 generic.go:334] "Generic (PLEG): container finished" podID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerID="caa800ac4c84c83e9417dadc3c2b16e0455be1f8b54108ba8224c7389f988d79" exitCode=0 Dec 01 15:19:17 crc kubenswrapper[4931]: I1201 15:19:17.829371 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-th96h" event={"ID":"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e","Type":"ContainerDied","Data":"caa800ac4c84c83e9417dadc3c2b16e0455be1f8b54108ba8224c7389f988d79"} Dec 01 15:19:17 crc kubenswrapper[4931]: I1201 15:19:17.832220 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f8b18d2-d611-4ad6-850a-4ad19544c016","Type":"ContainerStarted","Data":"46f10de2c518f8c69c966deba43efad45397695eb10c52be47377627996f1270"} Dec 01 15:19:17 crc kubenswrapper[4931]: I1201 15:19:17.832451 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 
15:19:17 crc kubenswrapper[4931]: I1201 15:19:17.835030 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a675ebc0-8c3b-4c43-884f-b32bd954ac6e","Type":"ContainerStarted","Data":"daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e"} Dec 01 15:19:17 crc kubenswrapper[4931]: I1201 15:19:17.835367 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:19:17 crc kubenswrapper[4931]: I1201 15:19:17.854469 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.214934333 podStartE2EDuration="1m8.854448917s" podCreationTimestamp="2025-12-01 15:18:09 +0000 UTC" firstStartedPulling="2025-12-01 15:18:11.507111079 +0000 UTC m=+1037.932984746" lastFinishedPulling="2025-12-01 15:18:38.146625663 +0000 UTC m=+1064.572499330" observedRunningTime="2025-12-01 15:19:17.848784255 +0000 UTC m=+1104.274657922" watchObservedRunningTime="2025-12-01 15:19:17.854448917 +0000 UTC m=+1104.280322584" Dec 01 15:19:17 crc kubenswrapper[4931]: I1201 15:19:17.875123 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.253219904 podStartE2EDuration="1m8.875103537s" podCreationTimestamp="2025-12-01 15:18:09 +0000 UTC" firstStartedPulling="2025-12-01 15:18:11.690712556 +0000 UTC m=+1038.116586223" lastFinishedPulling="2025-12-01 15:18:40.312596189 +0000 UTC m=+1066.738469856" observedRunningTime="2025-12-01 15:19:17.873145351 +0000 UTC m=+1104.299019028" watchObservedRunningTime="2025-12-01 15:19:17.875103537 +0000 UTC m=+1104.300977194" Dec 01 15:19:19 crc kubenswrapper[4931]: I1201 15:19:19.793363 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-v8h85" podUID="6f943374-baa7-4200-93ff-6773c58b032d" containerName="ovn-controller" probeResult="failure" output=< Dec 01 15:19:19 
crc kubenswrapper[4931]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 01 15:19:19 crc kubenswrapper[4931]: > Dec 01 15:19:19 crc kubenswrapper[4931]: I1201 15:19:19.881200 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:19:19 crc kubenswrapper[4931]: I1201 15:19:19.898452 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cgg9p" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.146749 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v8h85-config-8rx4b"] Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.147669 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.150185 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.164161 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v8h85-config-8rx4b"] Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.310253 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.310295 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-scripts\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " 
pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.310314 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-log-ovn\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.310334 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run-ovn\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.310359 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gk2t\" (UniqueName: \"kubernetes.io/projected/c3de09b9-898d-4f1e-8de7-d8e84efe0734-kube-api-access-7gk2t\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.310499 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-additional-scripts\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.411635 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-log-ovn\") pod 
\"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.411677 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-scripts\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.411719 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run-ovn\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.411747 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gk2t\" (UniqueName: \"kubernetes.io/projected/c3de09b9-898d-4f1e-8de7-d8e84efe0734-kube-api-access-7gk2t\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.411852 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-additional-scripts\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.411899 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run\") pod 
\"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.412145 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.412198 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-log-ovn\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.413607 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run-ovn\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.414062 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-scripts\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.414505 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-additional-scripts\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: 
\"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.431576 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gk2t\" (UniqueName: \"kubernetes.io/projected/c3de09b9-898d-4f1e-8de7-d8e84efe0734-kube-api-access-7gk2t\") pod \"ovn-controller-v8h85-config-8rx4b\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:20 crc kubenswrapper[4931]: I1201 15:19:20.465894 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:22 crc kubenswrapper[4931]: I1201 15:19:22.286598 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-th96h" podUID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Dec 01 15:19:23 crc kubenswrapper[4931]: I1201 15:19:23.459527 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:23 crc kubenswrapper[4931]: E1201 15:19:23.459876 4931 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 15:19:23 crc kubenswrapper[4931]: E1201 15:19:23.459905 4931 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 15:19:23 crc kubenswrapper[4931]: E1201 15:19:23.459969 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift podName:fe036b57-6753-42af-ad39-195f0688532d nodeName:}" 
failed. No retries permitted until 2025-12-01 15:19:39.459949539 +0000 UTC m=+1125.885823206 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift") pod "swift-storage-0" (UID: "fe036b57-6753-42af-ad39-195f0688532d") : configmap "swift-ring-files" not found Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.112827 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.271748 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-config\") pod \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.272035 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-nb\") pod \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.272096 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-sb\") pod \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.272116 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-dns-svc\") pod \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 
15:19:24.272179 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqn8d\" (UniqueName: \"kubernetes.io/projected/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-kube-api-access-dqn8d\") pod \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.282841 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-kube-api-access-dqn8d" (OuterVolumeSpecName: "kube-api-access-dqn8d") pod "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" (UID: "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e"). InnerVolumeSpecName "kube-api-access-dqn8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.353635 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" (UID: "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.373055 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-config" (OuterVolumeSpecName: "config") pod "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" (UID: "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.373494 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-config\") pod \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\" (UID: \"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e\") " Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.373933 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.373950 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqn8d\" (UniqueName: \"kubernetes.io/projected/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-kube-api-access-dqn8d\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:24 crc kubenswrapper[4931]: W1201 15:19:24.374582 4931 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e/volumes/kubernetes.io~configmap/config Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.374602 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-config" (OuterVolumeSpecName: "config") pod "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" (UID: "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.378890 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" (UID: "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.381824 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" (UID: "66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.475847 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.476732 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.476838 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.586100 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v8h85-config-8rx4b"] Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.813195 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-v8h85" podUID="6f943374-baa7-4200-93ff-6773c58b032d" containerName="ovn-controller" probeResult="failure" output=< Dec 01 15:19:24 crc kubenswrapper[4931]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 01 15:19:24 crc kubenswrapper[4931]: > Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.909595 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-lbvg4" event={"ID":"ec217c21-8698-4dd3-a58d-b2626cbfbaf2","Type":"ContainerStarted","Data":"b34cfabf0908b2355366e3f344b4b7bb32f03f23099000fbfac2fbcc996b8b64"} Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.912652 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-th96h" event={"ID":"66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e","Type":"ContainerDied","Data":"bda28af69f40fb9218833ee2c7cef5d1c70765788ea70dc31ca7675a34615c0e"} Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.912673 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-th96h" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.912700 4931 scope.go:117] "RemoveContainer" containerID="caa800ac4c84c83e9417dadc3c2b16e0455be1f8b54108ba8224c7389f988d79" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.916775 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v8h85-config-8rx4b" event={"ID":"c3de09b9-898d-4f1e-8de7-d8e84efe0734","Type":"ContainerStarted","Data":"6bbf8e217e78cea3c06a545bdebf79654405c4d14fcb17a11c668f27189b5543"} Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.916815 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v8h85-config-8rx4b" event={"ID":"c3de09b9-898d-4f1e-8de7-d8e84efe0734","Type":"ContainerStarted","Data":"645e24a2763dbfd1cb57d4a829be44d8aa53d79548e962291c95eab87b64d8d2"} Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.921129 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnxrs" event={"ID":"f23b7a9b-3694-415b-8e68-d0757a5ccc7a","Type":"ContainerStarted","Data":"9cc39002084d2cf94ab0f3ca359a9afd9479c8887132254516d171653c256b6a"} Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.930707 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/swift-ring-rebalance-lbvg4" podStartSLOduration=2.055306416 podStartE2EDuration="13.930691315s" podCreationTimestamp="2025-12-01 15:19:11 +0000 UTC" firstStartedPulling="2025-12-01 15:19:12.241150125 +0000 UTC m=+1098.667023822" lastFinishedPulling="2025-12-01 15:19:24.116535054 +0000 UTC m=+1110.542408721" observedRunningTime="2025-12-01 15:19:24.923960836 +0000 UTC m=+1111.349834503" watchObservedRunningTime="2025-12-01 15:19:24.930691315 +0000 UTC m=+1111.356564982" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.947502 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v8h85-config-8rx4b" podStartSLOduration=4.947484727 podStartE2EDuration="4.947484727s" podCreationTimestamp="2025-12-01 15:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:24.944274817 +0000 UTC m=+1111.370148484" watchObservedRunningTime="2025-12-01 15:19:24.947484727 +0000 UTC m=+1111.373358394" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.949326 4931 scope.go:117] "RemoveContainer" containerID="6368c91757c6f705763644e048eda3d1de319803441b16a54763e4341a079a97" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.973133 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xnxrs" podStartSLOduration=2.527250415 podStartE2EDuration="15.973113447s" podCreationTimestamp="2025-12-01 15:19:09 +0000 UTC" firstStartedPulling="2025-12-01 15:19:10.75498325 +0000 UTC m=+1097.180856917" lastFinishedPulling="2025-12-01 15:19:24.200846282 +0000 UTC m=+1110.626719949" observedRunningTime="2025-12-01 15:19:24.969097594 +0000 UTC m=+1111.394971261" watchObservedRunningTime="2025-12-01 15:19:24.973113447 +0000 UTC m=+1111.398987114" Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.989925 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-8554648995-th96h"] Dec 01 15:19:24 crc kubenswrapper[4931]: I1201 15:19:24.996261 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-th96h"] Dec 01 15:19:25 crc kubenswrapper[4931]: I1201 15:19:25.931301 4931 generic.go:334] "Generic (PLEG): container finished" podID="c3de09b9-898d-4f1e-8de7-d8e84efe0734" containerID="6bbf8e217e78cea3c06a545bdebf79654405c4d14fcb17a11c668f27189b5543" exitCode=0 Dec 01 15:19:25 crc kubenswrapper[4931]: I1201 15:19:25.931620 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v8h85-config-8rx4b" event={"ID":"c3de09b9-898d-4f1e-8de7-d8e84efe0734","Type":"ContainerDied","Data":"6bbf8e217e78cea3c06a545bdebf79654405c4d14fcb17a11c668f27189b5543"} Dec 01 15:19:26 crc kubenswrapper[4931]: I1201 15:19:26.253497 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" path="/var/lib/kubelet/pods/66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e/volumes" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.269895 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.287359 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-th96h" podUID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.433134 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-log-ovn\") pod \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.433216 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c3de09b9-898d-4f1e-8de7-d8e84efe0734" (UID: "c3de09b9-898d-4f1e-8de7-d8e84efe0734"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.433301 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gk2t\" (UniqueName: \"kubernetes.io/projected/c3de09b9-898d-4f1e-8de7-d8e84efe0734-kube-api-access-7gk2t\") pod \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.433337 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-scripts\") pod \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.434338 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run\") pod \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.434397 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-scripts" (OuterVolumeSpecName: "scripts") pod "c3de09b9-898d-4f1e-8de7-d8e84efe0734" (UID: "c3de09b9-898d-4f1e-8de7-d8e84efe0734"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.434436 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c3de09b9-898d-4f1e-8de7-d8e84efe0734" (UID: "c3de09b9-898d-4f1e-8de7-d8e84efe0734"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.434419 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run-ovn\") pod \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.434486 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run" (OuterVolumeSpecName: "var-run") pod "c3de09b9-898d-4f1e-8de7-d8e84efe0734" (UID: "c3de09b9-898d-4f1e-8de7-d8e84efe0734"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.434547 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-additional-scripts\") pod \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\" (UID: \"c3de09b9-898d-4f1e-8de7-d8e84efe0734\") " Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.435001 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c3de09b9-898d-4f1e-8de7-d8e84efe0734" (UID: "c3de09b9-898d-4f1e-8de7-d8e84efe0734"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.435724 4931 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.435749 4931 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.435757 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3de09b9-898d-4f1e-8de7-d8e84efe0734-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.435766 4931 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.435775 4931 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3de09b9-898d-4f1e-8de7-d8e84efe0734-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.438907 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3de09b9-898d-4f1e-8de7-d8e84efe0734-kube-api-access-7gk2t" (OuterVolumeSpecName: "kube-api-access-7gk2t") pod "c3de09b9-898d-4f1e-8de7-d8e84efe0734" (UID: "c3de09b9-898d-4f1e-8de7-d8e84efe0734"). InnerVolumeSpecName "kube-api-access-7gk2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.537434 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gk2t\" (UniqueName: \"kubernetes.io/projected/c3de09b9-898d-4f1e-8de7-d8e84efe0734-kube-api-access-7gk2t\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.687435 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v8h85-config-8rx4b"] Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.695307 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v8h85-config-8rx4b"] Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.788436 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v8h85-config-bgq98"] Dec 01 15:19:27 crc kubenswrapper[4931]: E1201 15:19:27.788791 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3de09b9-898d-4f1e-8de7-d8e84efe0734" containerName="ovn-config" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.788818 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3de09b9-898d-4f1e-8de7-d8e84efe0734" containerName="ovn-config" Dec 01 15:19:27 crc kubenswrapper[4931]: E1201 15:19:27.788838 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerName="init" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.788849 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerName="init" Dec 01 15:19:27 crc kubenswrapper[4931]: E1201 15:19:27.788886 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerName="dnsmasq-dns" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.788894 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerName="dnsmasq-dns" Dec 01 
15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.789116 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3de09b9-898d-4f1e-8de7-d8e84efe0734" containerName="ovn-config" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.789147 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c36f5d-6af4-43ec-8f4f-0b973d3c2e7e" containerName="dnsmasq-dns" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.789789 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.802145 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v8h85-config-bgq98"] Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.943737 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run-ovn\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.943791 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4l8v\" (UniqueName: \"kubernetes.io/projected/976da16a-88c5-41ab-bac7-5fee82f269eb-kube-api-access-d4l8v\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.943827 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-log-ovn\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " 
pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.943847 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-additional-scripts\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.943872 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-scripts\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.943918 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.947690 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="645e24a2763dbfd1cb57d4a829be44d8aa53d79548e962291c95eab87b64d8d2" Dec 01 15:19:27 crc kubenswrapper[4931]: I1201 15:19:27.947760 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v8h85-config-8rx4b" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.045842 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4l8v\" (UniqueName: \"kubernetes.io/projected/976da16a-88c5-41ab-bac7-5fee82f269eb-kube-api-access-d4l8v\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.045913 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-log-ovn\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.045934 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-additional-scripts\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.045961 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-scripts\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.046008 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: 
\"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.046086 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run-ovn\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.046376 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run-ovn\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.046647 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-log-ovn\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.047111 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-additional-scripts\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.048321 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-scripts\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " 
pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.048410 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.064277 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4l8v\" (UniqueName: \"kubernetes.io/projected/976da16a-88c5-41ab-bac7-5fee82f269eb-kube-api-access-d4l8v\") pod \"ovn-controller-v8h85-config-bgq98\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.108348 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.261212 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3de09b9-898d-4f1e-8de7-d8e84efe0734" path="/var/lib/kubelet/pods/c3de09b9-898d-4f1e-8de7-d8e84efe0734/volumes" Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.558446 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v8h85-config-bgq98"] Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.957262 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v8h85-config-bgq98" event={"ID":"976da16a-88c5-41ab-bac7-5fee82f269eb","Type":"ContainerStarted","Data":"1e4ee656f788c135500ac63c83ccdd8364b006c0124aa97183cd21c2417e9f26"} Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.957318 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v8h85-config-bgq98" 
event={"ID":"976da16a-88c5-41ab-bac7-5fee82f269eb","Type":"ContainerStarted","Data":"9f2e7ba14849f89b62db130532b59aad63601b5758dcdd46642b01974c802f34"} Dec 01 15:19:28 crc kubenswrapper[4931]: I1201 15:19:28.979109 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v8h85-config-bgq98" podStartSLOduration=1.9790897410000001 podStartE2EDuration="1.979089741s" podCreationTimestamp="2025-12-01 15:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:28.974858942 +0000 UTC m=+1115.400732619" watchObservedRunningTime="2025-12-01 15:19:28.979089741 +0000 UTC m=+1115.404963418" Dec 01 15:19:29 crc kubenswrapper[4931]: I1201 15:19:29.814242 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-v8h85" Dec 01 15:19:29 crc kubenswrapper[4931]: I1201 15:19:29.964616 4931 generic.go:334] "Generic (PLEG): container finished" podID="976da16a-88c5-41ab-bac7-5fee82f269eb" containerID="1e4ee656f788c135500ac63c83ccdd8364b006c0124aa97183cd21c2417e9f26" exitCode=0 Dec 01 15:19:29 crc kubenswrapper[4931]: I1201 15:19:29.964664 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v8h85-config-bgq98" event={"ID":"976da16a-88c5-41ab-bac7-5fee82f269eb","Type":"ContainerDied","Data":"1e4ee656f788c135500ac63c83ccdd8364b006c0124aa97183cd21c2417e9f26"} Dec 01 15:19:30 crc kubenswrapper[4931]: I1201 15:19:30.825619 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.156672 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.200408 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-r5gs2"] Dec 01 15:19:31 
crc kubenswrapper[4931]: I1201 15:19:31.201713 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.254439 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-bb76-account-create-update-2jg2w"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.255457 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.261146 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.269711 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r5gs2"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.293590 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53a77930-f114-45b8-9256-94d34ad92839-operator-scripts\") pod \"cinder-bb76-account-create-update-2jg2w\" (UID: \"53a77930-f114-45b8-9256-94d34ad92839\") " pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.293689 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wfmq\" (UniqueName: \"kubernetes.io/projected/53a77930-f114-45b8-9256-94d34ad92839-kube-api-access-7wfmq\") pod \"cinder-bb76-account-create-update-2jg2w\" (UID: \"53a77930-f114-45b8-9256-94d34ad92839\") " pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.293723 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhzfs\" (UniqueName: 
\"kubernetes.io/projected/188bed20-c6ed-4ead-a520-03ec97974362-kube-api-access-fhzfs\") pod \"cinder-db-create-r5gs2\" (UID: \"188bed20-c6ed-4ead-a520-03ec97974362\") " pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.293743 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188bed20-c6ed-4ead-a520-03ec97974362-operator-scripts\") pod \"cinder-db-create-r5gs2\" (UID: \"188bed20-c6ed-4ead-a520-03ec97974362\") " pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.301461 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bb76-account-create-update-2jg2w"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.332361 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-czvkh"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.333343 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.339401 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-czvkh"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.347902 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3065-account-create-update-vfvmz"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.349255 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.351781 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.353689 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3065-account-create-update-vfvmz"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.409600 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53a77930-f114-45b8-9256-94d34ad92839-operator-scripts\") pod \"cinder-bb76-account-create-update-2jg2w\" (UID: \"53a77930-f114-45b8-9256-94d34ad92839\") " pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.409672 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8pjp\" (UniqueName: \"kubernetes.io/projected/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-kube-api-access-z8pjp\") pod \"barbican-db-create-czvkh\" (UID: \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\") " pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.409726 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wfmq\" (UniqueName: \"kubernetes.io/projected/53a77930-f114-45b8-9256-94d34ad92839-kube-api-access-7wfmq\") pod \"cinder-bb76-account-create-update-2jg2w\" (UID: \"53a77930-f114-45b8-9256-94d34ad92839\") " pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.409755 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhzfs\" (UniqueName: \"kubernetes.io/projected/188bed20-c6ed-4ead-a520-03ec97974362-kube-api-access-fhzfs\") pod \"cinder-db-create-r5gs2\" (UID: 
\"188bed20-c6ed-4ead-a520-03ec97974362\") " pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.409777 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188bed20-c6ed-4ead-a520-03ec97974362-operator-scripts\") pod \"cinder-db-create-r5gs2\" (UID: \"188bed20-c6ed-4ead-a520-03ec97974362\") " pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.409812 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-operator-scripts\") pod \"barbican-3065-account-create-update-vfvmz\" (UID: \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\") " pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.409869 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqkff\" (UniqueName: \"kubernetes.io/projected/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-kube-api-access-pqkff\") pod \"barbican-3065-account-create-update-vfvmz\" (UID: \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\") " pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.409888 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-operator-scripts\") pod \"barbican-db-create-czvkh\" (UID: \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\") " pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.410736 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/53a77930-f114-45b8-9256-94d34ad92839-operator-scripts\") pod \"cinder-bb76-account-create-update-2jg2w\" (UID: \"53a77930-f114-45b8-9256-94d34ad92839\") " pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.411027 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188bed20-c6ed-4ead-a520-03ec97974362-operator-scripts\") pod \"cinder-db-create-r5gs2\" (UID: \"188bed20-c6ed-4ead-a520-03ec97974362\") " pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.443719 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhzfs\" (UniqueName: \"kubernetes.io/projected/188bed20-c6ed-4ead-a520-03ec97974362-kube-api-access-fhzfs\") pod \"cinder-db-create-r5gs2\" (UID: \"188bed20-c6ed-4ead-a520-03ec97974362\") " pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.448193 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wfmq\" (UniqueName: \"kubernetes.io/projected/53a77930-f114-45b8-9256-94d34ad92839-kube-api-access-7wfmq\") pod \"cinder-bb76-account-create-update-2jg2w\" (UID: \"53a77930-f114-45b8-9256-94d34ad92839\") " pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.488503 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.508916 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8zdh2"] Dec 01 15:19:31 crc kubenswrapper[4931]: E1201 15:19:31.509402 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="976da16a-88c5-41ab-bac7-5fee82f269eb" containerName="ovn-config" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.509422 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="976da16a-88c5-41ab-bac7-5fee82f269eb" containerName="ovn-config" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.509624 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="976da16a-88c5-41ab-bac7-5fee82f269eb" containerName="ovn-config" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.510258 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.511777 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-operator-scripts\") pod \"barbican-3065-account-create-update-vfvmz\" (UID: \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\") " pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.511831 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqkff\" (UniqueName: \"kubernetes.io/projected/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-kube-api-access-pqkff\") pod \"barbican-3065-account-create-update-vfvmz\" (UID: \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\") " pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.511855 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-operator-scripts\") pod \"barbican-db-create-czvkh\" (UID: \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\") " pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.511909 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8pjp\" (UniqueName: \"kubernetes.io/projected/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-kube-api-access-z8pjp\") pod \"barbican-db-create-czvkh\" (UID: \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\") " pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.514626 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-operator-scripts\") pod \"barbican-3065-account-create-update-vfvmz\" (UID: \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\") " pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.518025 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-operator-scripts\") pod \"barbican-db-create-czvkh\" (UID: \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\") " pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.524790 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0a7a-account-create-update-9qs6p"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.526073 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.528782 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.531374 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.545266 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqkff\" (UniqueName: \"kubernetes.io/projected/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-kube-api-access-pqkff\") pod \"barbican-3065-account-create-update-vfvmz\" (UID: \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\") " pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.548329 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8pjp\" (UniqueName: \"kubernetes.io/projected/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-kube-api-access-z8pjp\") pod \"barbican-db-create-czvkh\" (UID: \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\") " pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.551143 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8zdh2"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.577674 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0a7a-account-create-update-9qs6p"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.590731 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.612665 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4l8v\" (UniqueName: \"kubernetes.io/projected/976da16a-88c5-41ab-bac7-5fee82f269eb-kube-api-access-d4l8v\") pod \"976da16a-88c5-41ab-bac7-5fee82f269eb\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.612767 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-scripts\") pod \"976da16a-88c5-41ab-bac7-5fee82f269eb\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.612798 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-log-ovn\") pod \"976da16a-88c5-41ab-bac7-5fee82f269eb\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.612841 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run\") pod \"976da16a-88c5-41ab-bac7-5fee82f269eb\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.612880 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run-ovn\") pod \"976da16a-88c5-41ab-bac7-5fee82f269eb\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.612909 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-additional-scripts\") pod \"976da16a-88c5-41ab-bac7-5fee82f269eb\" (UID: \"976da16a-88c5-41ab-bac7-5fee82f269eb\") " Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.613159 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dmww\" (UniqueName: \"kubernetes.io/projected/d8f50443-d0a3-4976-943e-0ac0de49b9b9-kube-api-access-9dmww\") pod \"neutron-0a7a-account-create-update-9qs6p\" (UID: \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\") " pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.613223 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b88a61-4bfa-4704-b3ae-1f61c9b65488-operator-scripts\") pod \"neutron-db-create-8zdh2\" (UID: \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\") " pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.613255 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f50443-d0a3-4976-943e-0ac0de49b9b9-operator-scripts\") pod \"neutron-0a7a-account-create-update-9qs6p\" (UID: \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\") " pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.613285 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hrfm\" (UniqueName: \"kubernetes.io/projected/23b88a61-4bfa-4704-b3ae-1f61c9b65488-kube-api-access-4hrfm\") pod \"neutron-db-create-8zdh2\" (UID: \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\") " pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.615472 4931 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "976da16a-88c5-41ab-bac7-5fee82f269eb" (UID: "976da16a-88c5-41ab-bac7-5fee82f269eb"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.616407 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-scripts" (OuterVolumeSpecName: "scripts") pod "976da16a-88c5-41ab-bac7-5fee82f269eb" (UID: "976da16a-88c5-41ab-bac7-5fee82f269eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.616433 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run" (OuterVolumeSpecName: "var-run") pod "976da16a-88c5-41ab-bac7-5fee82f269eb" (UID: "976da16a-88c5-41ab-bac7-5fee82f269eb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.616490 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "976da16a-88c5-41ab-bac7-5fee82f269eb" (UID: "976da16a-88c5-41ab-bac7-5fee82f269eb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.625600 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "976da16a-88c5-41ab-bac7-5fee82f269eb" (UID: "976da16a-88c5-41ab-bac7-5fee82f269eb"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.655362 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dnbwr"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.662727 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.675116 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w5fnq" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.675310 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.675467 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.675846 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.676214 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.678360 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dnbwr"] Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.687194 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.688617 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976da16a-88c5-41ab-bac7-5fee82f269eb-kube-api-access-d4l8v" (OuterVolumeSpecName: "kube-api-access-d4l8v") pod "976da16a-88c5-41ab-bac7-5fee82f269eb" (UID: "976da16a-88c5-41ab-bac7-5fee82f269eb"). 
InnerVolumeSpecName "kube-api-access-d4l8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715260 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-config-data\") pod \"keystone-db-sync-dnbwr\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715335 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dmww\" (UniqueName: \"kubernetes.io/projected/d8f50443-d0a3-4976-943e-0ac0de49b9b9-kube-api-access-9dmww\") pod \"neutron-0a7a-account-create-update-9qs6p\" (UID: \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\") " pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715357 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvglr\" (UniqueName: \"kubernetes.io/projected/aeec95d4-0a22-40e6-b1ca-b15703d71b47-kube-api-access-pvglr\") pod \"keystone-db-sync-dnbwr\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715433 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b88a61-4bfa-4704-b3ae-1f61c9b65488-operator-scripts\") pod \"neutron-db-create-8zdh2\" (UID: \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\") " pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715458 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-combined-ca-bundle\") pod 
\"keystone-db-sync-dnbwr\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715477 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f50443-d0a3-4976-943e-0ac0de49b9b9-operator-scripts\") pod \"neutron-0a7a-account-create-update-9qs6p\" (UID: \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\") " pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715510 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hrfm\" (UniqueName: \"kubernetes.io/projected/23b88a61-4bfa-4704-b3ae-1f61c9b65488-kube-api-access-4hrfm\") pod \"neutron-db-create-8zdh2\" (UID: \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\") " pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715554 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4l8v\" (UniqueName: \"kubernetes.io/projected/976da16a-88c5-41ab-bac7-5fee82f269eb-kube-api-access-d4l8v\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715564 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715627 4931 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715665 4931 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:31 crc 
kubenswrapper[4931]: I1201 15:19:31.715679 4931 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/976da16a-88c5-41ab-bac7-5fee82f269eb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.715693 4931 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/976da16a-88c5-41ab-bac7-5fee82f269eb-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.716236 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b88a61-4bfa-4704-b3ae-1f61c9b65488-operator-scripts\") pod \"neutron-db-create-8zdh2\" (UID: \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\") " pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.716530 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f50443-d0a3-4976-943e-0ac0de49b9b9-operator-scripts\") pod \"neutron-0a7a-account-create-update-9qs6p\" (UID: \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\") " pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.741069 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dmww\" (UniqueName: \"kubernetes.io/projected/d8f50443-d0a3-4976-943e-0ac0de49b9b9-kube-api-access-9dmww\") pod \"neutron-0a7a-account-create-update-9qs6p\" (UID: \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\") " pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.772949 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hrfm\" (UniqueName: \"kubernetes.io/projected/23b88a61-4bfa-4704-b3ae-1f61c9b65488-kube-api-access-4hrfm\") pod 
\"neutron-db-create-8zdh2\" (UID: \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\") " pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.817183 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-combined-ca-bundle\") pod \"keystone-db-sync-dnbwr\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.817590 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-config-data\") pod \"keystone-db-sync-dnbwr\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.817634 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvglr\" (UniqueName: \"kubernetes.io/projected/aeec95d4-0a22-40e6-b1ca-b15703d71b47-kube-api-access-pvglr\") pod \"keystone-db-sync-dnbwr\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.821287 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-combined-ca-bundle\") pod \"keystone-db-sync-dnbwr\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.826026 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-config-data\") pod \"keystone-db-sync-dnbwr\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " pod="openstack/keystone-db-sync-dnbwr" Dec 01 
15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.855796 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.897316 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvglr\" (UniqueName: \"kubernetes.io/projected/aeec95d4-0a22-40e6-b1ca-b15703d71b47-kube-api-access-pvglr\") pod \"keystone-db-sync-dnbwr\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:31 crc kubenswrapper[4931]: I1201 15:19:31.913991 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.004156 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.005238 4931 generic.go:334] "Generic (PLEG): container finished" podID="ec217c21-8698-4dd3-a58d-b2626cbfbaf2" containerID="b34cfabf0908b2355366e3f344b4b7bb32f03f23099000fbfac2fbcc996b8b64" exitCode=0 Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.005317 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lbvg4" event={"ID":"ec217c21-8698-4dd3-a58d-b2626cbfbaf2","Type":"ContainerDied","Data":"b34cfabf0908b2355366e3f344b4b7bb32f03f23099000fbfac2fbcc996b8b64"} Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.009848 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v8h85-config-bgq98" event={"ID":"976da16a-88c5-41ab-bac7-5fee82f269eb","Type":"ContainerDied","Data":"9f2e7ba14849f89b62db130532b59aad63601b5758dcdd46642b01974c802f34"} Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.009880 4931 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9f2e7ba14849f89b62db130532b59aad63601b5758dcdd46642b01974c802f34" Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.009932 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v8h85-config-bgq98" Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.145803 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v8h85-config-bgq98"] Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.166244 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v8h85-config-bgq98"] Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.268453 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="976da16a-88c5-41ab-bac7-5fee82f269eb" path="/var/lib/kubelet/pods/976da16a-88c5-41ab-bac7-5fee82f269eb/volumes" Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.272656 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-r5gs2"] Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.499006 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bb76-account-create-update-2jg2w"] Dec 01 15:19:32 crc kubenswrapper[4931]: W1201 15:19:32.502540 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53a77930_f114_45b8_9256_94d34ad92839.slice/crio-29a58690b9c4f43f066e2142868967e12ce8873e63a9a59ff416292d796b418c WatchSource:0}: Error finding container 29a58690b9c4f43f066e2142868967e12ce8873e63a9a59ff416292d796b418c: Status 404 returned error can't find the container with id 29a58690b9c4f43f066e2142868967e12ce8873e63a9a59ff416292d796b418c Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.604491 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-czvkh"] Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.616203 4931 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-3065-account-create-update-vfvmz"] Dec 01 15:19:32 crc kubenswrapper[4931]: W1201 15:19:32.698629 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23b88a61_4bfa_4704_b3ae_1f61c9b65488.slice/crio-2066abb705e74ac100d00db134eb94f0b7e2b54f66bb05fed8f4298062f86370 WatchSource:0}: Error finding container 2066abb705e74ac100d00db134eb94f0b7e2b54f66bb05fed8f4298062f86370: Status 404 returned error can't find the container with id 2066abb705e74ac100d00db134eb94f0b7e2b54f66bb05fed8f4298062f86370 Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.700027 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8zdh2"] Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.718657 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dnbwr"] Dec 01 15:19:32 crc kubenswrapper[4931]: W1201 15:19:32.736593 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeec95d4_0a22_40e6_b1ca_b15703d71b47.slice/crio-3e20b803556bb98fc2544b94bfc5b52e6b26e904973667e903c862212551b7da WatchSource:0}: Error finding container 3e20b803556bb98fc2544b94bfc5b52e6b26e904973667e903c862212551b7da: Status 404 returned error can't find the container with id 3e20b803556bb98fc2544b94bfc5b52e6b26e904973667e903c862212551b7da Dec 01 15:19:32 crc kubenswrapper[4931]: I1201 15:19:32.742498 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0a7a-account-create-update-9qs6p"] Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.021775 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8zdh2" event={"ID":"23b88a61-4bfa-4704-b3ae-1f61c9b65488","Type":"ContainerStarted","Data":"36e2ad7b2c9c6bb49ffeb3b09196062a70b6c92c956052032a50f8e33d3593a4"} Dec 01 15:19:33 crc kubenswrapper[4931]: 
I1201 15:19:33.022064 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8zdh2" event={"ID":"23b88a61-4bfa-4704-b3ae-1f61c9b65488","Type":"ContainerStarted","Data":"2066abb705e74ac100d00db134eb94f0b7e2b54f66bb05fed8f4298062f86370"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.024126 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3065-account-create-update-vfvmz" event={"ID":"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b","Type":"ContainerStarted","Data":"7aa32bfcb52781fe815cf9f97649051984403a6c884e00198060ca109fee8b7b"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.024183 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3065-account-create-update-vfvmz" event={"ID":"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b","Type":"ContainerStarted","Data":"2a5b076a9f036aa5b847128ca22480c312704313e0458e6673c86ee38551ed84"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.025849 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dnbwr" event={"ID":"aeec95d4-0a22-40e6-b1ca-b15703d71b47","Type":"ContainerStarted","Data":"3e20b803556bb98fc2544b94bfc5b52e6b26e904973667e903c862212551b7da"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.027202 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0a7a-account-create-update-9qs6p" event={"ID":"d8f50443-d0a3-4976-943e-0ac0de49b9b9","Type":"ContainerStarted","Data":"393e97e8f435b50a3f01a9d0061e12c6f41976aefe289a2909ec415ee54a8664"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.027232 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0a7a-account-create-update-9qs6p" event={"ID":"d8f50443-d0a3-4976-943e-0ac0de49b9b9","Type":"ContainerStarted","Data":"13011334bdf1e6143c73e3a67840ab7ef54f5d3eac764ed1b6e03fd74738639f"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.034641 4931 generic.go:334] "Generic (PLEG): container 
finished" podID="188bed20-c6ed-4ead-a520-03ec97974362" containerID="a863911a6a782d716dfc46306a44f46f24cdc14d169cac5410a982b5443dc918" exitCode=0 Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.034751 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r5gs2" event={"ID":"188bed20-c6ed-4ead-a520-03ec97974362","Type":"ContainerDied","Data":"a863911a6a782d716dfc46306a44f46f24cdc14d169cac5410a982b5443dc918"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.034970 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r5gs2" event={"ID":"188bed20-c6ed-4ead-a520-03ec97974362","Type":"ContainerStarted","Data":"f4cdfa4ae83905acb1619faa30c3c1f9aaa5c8d154869badb140c6efd989378e"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.042838 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bb76-account-create-update-2jg2w" event={"ID":"53a77930-f114-45b8-9256-94d34ad92839","Type":"ContainerStarted","Data":"3d42c6bf96381e4cee89b6dc4c52cac055b381c0c484c3f3bcea642acc9ae693"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.042889 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bb76-account-create-update-2jg2w" event={"ID":"53a77930-f114-45b8-9256-94d34ad92839","Type":"ContainerStarted","Data":"29a58690b9c4f43f066e2142868967e12ce8873e63a9a59ff416292d796b418c"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.045878 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-8zdh2" podStartSLOduration=2.045856734 podStartE2EDuration="2.045856734s" podCreationTimestamp="2025-12-01 15:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:33.044624639 +0000 UTC m=+1119.470498296" watchObservedRunningTime="2025-12-01 15:19:33.045856734 +0000 UTC m=+1119.471730401" Dec 01 15:19:33 crc 
kubenswrapper[4931]: I1201 15:19:33.047996 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-czvkh" event={"ID":"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450","Type":"ContainerStarted","Data":"474562a1b39cf4e490f292ca2d98e8481d1c879f2289066e262720282c3de466"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.048062 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-czvkh" event={"ID":"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450","Type":"ContainerStarted","Data":"c034559eebd075f1b8b4f44f7b359e76151ec0f6683af10395fce9dcff0a0e37"} Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.098431 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-3065-account-create-update-vfvmz" podStartSLOduration=2.09840454 podStartE2EDuration="2.09840454s" podCreationTimestamp="2025-12-01 15:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:33.084365556 +0000 UTC m=+1119.510239223" watchObservedRunningTime="2025-12-01 15:19:33.09840454 +0000 UTC m=+1119.524278227" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.107897 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-bb76-account-create-update-2jg2w" podStartSLOduration=2.107880856 podStartE2EDuration="2.107880856s" podCreationTimestamp="2025-12-01 15:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:33.097014451 +0000 UTC m=+1119.522888118" watchObservedRunningTime="2025-12-01 15:19:33.107880856 +0000 UTC m=+1119.533754513" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.121600 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-0a7a-account-create-update-9qs6p" podStartSLOduration=2.121579981 
podStartE2EDuration="2.121579981s" podCreationTimestamp="2025-12-01 15:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:33.118278428 +0000 UTC m=+1119.544152095" watchObservedRunningTime="2025-12-01 15:19:33.121579981 +0000 UTC m=+1119.547453648" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.407501 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.434018 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-czvkh" podStartSLOduration=2.433998518 podStartE2EDuration="2.433998518s" podCreationTimestamp="2025-12-01 15:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:33.179899579 +0000 UTC m=+1119.605773246" watchObservedRunningTime="2025-12-01 15:19:33.433998518 +0000 UTC m=+1119.859872205" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.464920 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-swiftconf\") pod \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.464992 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-combined-ca-bundle\") pod \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.465023 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-scripts\") pod \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.465112 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-ring-data-devices\") pod \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.465149 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbwzs\" (UniqueName: \"kubernetes.io/projected/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-kube-api-access-tbwzs\") pod \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.465182 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-dispersionconf\") pod \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.465210 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-etc-swift\") pod \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\" (UID: \"ec217c21-8698-4dd3-a58d-b2626cbfbaf2\") " Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.465909 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ec217c21-8698-4dd3-a58d-b2626cbfbaf2" (UID: "ec217c21-8698-4dd3-a58d-b2626cbfbaf2"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.466312 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ec217c21-8698-4dd3-a58d-b2626cbfbaf2" (UID: "ec217c21-8698-4dd3-a58d-b2626cbfbaf2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.477040 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-kube-api-access-tbwzs" (OuterVolumeSpecName: "kube-api-access-tbwzs") pod "ec217c21-8698-4dd3-a58d-b2626cbfbaf2" (UID: "ec217c21-8698-4dd3-a58d-b2626cbfbaf2"). InnerVolumeSpecName "kube-api-access-tbwzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.496933 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ec217c21-8698-4dd3-a58d-b2626cbfbaf2" (UID: "ec217c21-8698-4dd3-a58d-b2626cbfbaf2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.499969 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-scripts" (OuterVolumeSpecName: "scripts") pod "ec217c21-8698-4dd3-a58d-b2626cbfbaf2" (UID: "ec217c21-8698-4dd3-a58d-b2626cbfbaf2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.502027 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec217c21-8698-4dd3-a58d-b2626cbfbaf2" (UID: "ec217c21-8698-4dd3-a58d-b2626cbfbaf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.507572 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ec217c21-8698-4dd3-a58d-b2626cbfbaf2" (UID: "ec217c21-8698-4dd3-a58d-b2626cbfbaf2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.567531 4931 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.567569 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.567580 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.567589 4931 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:33 
crc kubenswrapper[4931]: I1201 15:19:33.567600 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbwzs\" (UniqueName: \"kubernetes.io/projected/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-kube-api-access-tbwzs\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.567611 4931 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:33 crc kubenswrapper[4931]: I1201 15:19:33.567619 4931 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec217c21-8698-4dd3-a58d-b2626cbfbaf2-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.059244 4931 generic.go:334] "Generic (PLEG): container finished" podID="23b88a61-4bfa-4704-b3ae-1f61c9b65488" containerID="36e2ad7b2c9c6bb49ffeb3b09196062a70b6c92c956052032a50f8e33d3593a4" exitCode=0 Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.059325 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8zdh2" event={"ID":"23b88a61-4bfa-4704-b3ae-1f61c9b65488","Type":"ContainerDied","Data":"36e2ad7b2c9c6bb49ffeb3b09196062a70b6c92c956052032a50f8e33d3593a4"} Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.060812 4931 generic.go:334] "Generic (PLEG): container finished" podID="c273f1c9-0358-4bcc-a98d-2c3ab0cd677b" containerID="7aa32bfcb52781fe815cf9f97649051984403a6c884e00198060ca109fee8b7b" exitCode=0 Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.060867 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3065-account-create-update-vfvmz" event={"ID":"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b","Type":"ContainerDied","Data":"7aa32bfcb52781fe815cf9f97649051984403a6c884e00198060ca109fee8b7b"} Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.062546 
4931 generic.go:334] "Generic (PLEG): container finished" podID="f23b7a9b-3694-415b-8e68-d0757a5ccc7a" containerID="9cc39002084d2cf94ab0f3ca359a9afd9479c8887132254516d171653c256b6a" exitCode=0 Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.062649 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnxrs" event={"ID":"f23b7a9b-3694-415b-8e68-d0757a5ccc7a","Type":"ContainerDied","Data":"9cc39002084d2cf94ab0f3ca359a9afd9479c8887132254516d171653c256b6a"} Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.067368 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lbvg4" event={"ID":"ec217c21-8698-4dd3-a58d-b2626cbfbaf2","Type":"ContainerDied","Data":"b6e88870ba4cbe8547807308025faf9ed80eb64e92cd44a8a7c0d1d503641103"} Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.067409 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6e88870ba4cbe8547807308025faf9ed80eb64e92cd44a8a7c0d1d503641103" Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.067434 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-lbvg4" Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.069341 4931 generic.go:334] "Generic (PLEG): container finished" podID="d8f50443-d0a3-4976-943e-0ac0de49b9b9" containerID="393e97e8f435b50a3f01a9d0061e12c6f41976aefe289a2909ec415ee54a8664" exitCode=0 Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.069401 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0a7a-account-create-update-9qs6p" event={"ID":"d8f50443-d0a3-4976-943e-0ac0de49b9b9","Type":"ContainerDied","Data":"393e97e8f435b50a3f01a9d0061e12c6f41976aefe289a2909ec415ee54a8664"} Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.070799 4931 generic.go:334] "Generic (PLEG): container finished" podID="53a77930-f114-45b8-9256-94d34ad92839" containerID="3d42c6bf96381e4cee89b6dc4c52cac055b381c0c484c3f3bcea642acc9ae693" exitCode=0 Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.070850 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bb76-account-create-update-2jg2w" event={"ID":"53a77930-f114-45b8-9256-94d34ad92839","Type":"ContainerDied","Data":"3d42c6bf96381e4cee89b6dc4c52cac055b381c0c484c3f3bcea642acc9ae693"} Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.072019 4931 generic.go:334] "Generic (PLEG): container finished" podID="ff2a8742-bece-4c6a-9f4d-6c7d1dc97450" containerID="474562a1b39cf4e490f292ca2d98e8481d1c879f2289066e262720282c3de466" exitCode=0 Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.072064 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-czvkh" event={"ID":"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450","Type":"ContainerDied","Data":"474562a1b39cf4e490f292ca2d98e8481d1c879f2289066e262720282c3de466"} Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.443514 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.483210 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhzfs\" (UniqueName: \"kubernetes.io/projected/188bed20-c6ed-4ead-a520-03ec97974362-kube-api-access-fhzfs\") pod \"188bed20-c6ed-4ead-a520-03ec97974362\" (UID: \"188bed20-c6ed-4ead-a520-03ec97974362\") " Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.483329 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188bed20-c6ed-4ead-a520-03ec97974362-operator-scripts\") pod \"188bed20-c6ed-4ead-a520-03ec97974362\" (UID: \"188bed20-c6ed-4ead-a520-03ec97974362\") " Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.484237 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188bed20-c6ed-4ead-a520-03ec97974362-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "188bed20-c6ed-4ead-a520-03ec97974362" (UID: "188bed20-c6ed-4ead-a520-03ec97974362"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.491508 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188bed20-c6ed-4ead-a520-03ec97974362-kube-api-access-fhzfs" (OuterVolumeSpecName: "kube-api-access-fhzfs") pod "188bed20-c6ed-4ead-a520-03ec97974362" (UID: "188bed20-c6ed-4ead-a520-03ec97974362"). InnerVolumeSpecName "kube-api-access-fhzfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.584964 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188bed20-c6ed-4ead-a520-03ec97974362-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:34 crc kubenswrapper[4931]: I1201 15:19:34.585000 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhzfs\" (UniqueName: \"kubernetes.io/projected/188bed20-c6ed-4ead-a520-03ec97974362-kube-api-access-fhzfs\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:35 crc kubenswrapper[4931]: I1201 15:19:35.080619 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-r5gs2" event={"ID":"188bed20-c6ed-4ead-a520-03ec97974362","Type":"ContainerDied","Data":"f4cdfa4ae83905acb1619faa30c3c1f9aaa5c8d154869badb140c6efd989378e"} Dec 01 15:19:35 crc kubenswrapper[4931]: I1201 15:19:35.080985 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4cdfa4ae83905acb1619faa30c3c1f9aaa5c8d154869badb140c6efd989378e" Dec 01 15:19:35 crc kubenswrapper[4931]: I1201 15:19:35.080694 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-r5gs2" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.112894 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-czvkh" event={"ID":"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450","Type":"ContainerDied","Data":"c034559eebd075f1b8b4f44f7b359e76151ec0f6683af10395fce9dcff0a0e37"} Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.113280 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c034559eebd075f1b8b4f44f7b359e76151ec0f6683af10395fce9dcff0a0e37" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.116099 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8zdh2" event={"ID":"23b88a61-4bfa-4704-b3ae-1f61c9b65488","Type":"ContainerDied","Data":"2066abb705e74ac100d00db134eb94f0b7e2b54f66bb05fed8f4298062f86370"} Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.116146 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2066abb705e74ac100d00db134eb94f0b7e2b54f66bb05fed8f4298062f86370" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.116182 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.117871 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3065-account-create-update-vfvmz" event={"ID":"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b","Type":"ContainerDied","Data":"2a5b076a9f036aa5b847128ca22480c312704313e0458e6673c86ee38551ed84"} Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.117895 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5b076a9f036aa5b847128ca22480c312704313e0458e6673c86ee38551ed84" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.120340 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnxrs" event={"ID":"f23b7a9b-3694-415b-8e68-d0757a5ccc7a","Type":"ContainerDied","Data":"3622988efc42f0d8cc146cd05fbcfc951d48dfec73167b903a424808e31218b9"} Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.120372 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3622988efc42f0d8cc146cd05fbcfc951d48dfec73167b903a424808e31218b9" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.121618 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0a7a-account-create-update-9qs6p" event={"ID":"d8f50443-d0a3-4976-943e-0ac0de49b9b9","Type":"ContainerDied","Data":"13011334bdf1e6143c73e3a67840ab7ef54f5d3eac764ed1b6e03fd74738639f"} Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.121638 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13011334bdf1e6143c73e3a67840ab7ef54f5d3eac764ed1b6e03fd74738639f" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.122948 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bb76-account-create-update-2jg2w" event={"ID":"53a77930-f114-45b8-9256-94d34ad92839","Type":"ContainerDied","Data":"29a58690b9c4f43f066e2142868967e12ce8873e63a9a59ff416292d796b418c"} Dec 
01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.122975 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a58690b9c4f43f066e2142868967e12ce8873e63a9a59ff416292d796b418c" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.159106 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-operator-scripts\") pod \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\" (UID: \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.159201 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8pjp\" (UniqueName: \"kubernetes.io/projected/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-kube-api-access-z8pjp\") pod \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\" (UID: \"ff2a8742-bece-4c6a-9f4d-6c7d1dc97450\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.159971 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff2a8742-bece-4c6a-9f4d-6c7d1dc97450" (UID: "ff2a8742-bece-4c6a-9f4d-6c7d1dc97450"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.165631 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-kube-api-access-z8pjp" (OuterVolumeSpecName: "kube-api-access-z8pjp") pod "ff2a8742-bece-4c6a-9f4d-6c7d1dc97450" (UID: "ff2a8742-bece-4c6a-9f4d-6c7d1dc97450"). InnerVolumeSpecName "kube-api-access-z8pjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.246143 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.251023 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.260366 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.262014 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8pjp\" (UniqueName: \"kubernetes.io/projected/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-kube-api-access-z8pjp\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.262042 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.293020 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.315718 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363033 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b88a61-4bfa-4704-b3ae-1f61c9b65488-operator-scripts\") pod \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\" (UID: \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363084 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-db-sync-config-data\") pod \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363169 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hrfm\" (UniqueName: \"kubernetes.io/projected/23b88a61-4bfa-4704-b3ae-1f61c9b65488-kube-api-access-4hrfm\") pod \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\" (UID: \"23b88a61-4bfa-4704-b3ae-1f61c9b65488\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363196 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wfmq\" (UniqueName: \"kubernetes.io/projected/53a77930-f114-45b8-9256-94d34ad92839-kube-api-access-7wfmq\") pod \"53a77930-f114-45b8-9256-94d34ad92839\" (UID: \"53a77930-f114-45b8-9256-94d34ad92839\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363221 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dmww\" (UniqueName: \"kubernetes.io/projected/d8f50443-d0a3-4976-943e-0ac0de49b9b9-kube-api-access-9dmww\") pod \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\" (UID: \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363247 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f50443-d0a3-4976-943e-0ac0de49b9b9-operator-scripts\") pod \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\" (UID: \"d8f50443-d0a3-4976-943e-0ac0de49b9b9\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363324 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53a77930-f114-45b8-9256-94d34ad92839-operator-scripts\") pod \"53a77930-f114-45b8-9256-94d34ad92839\" (UID: \"53a77930-f114-45b8-9256-94d34ad92839\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363347 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-operator-scripts\") pod \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\" (UID: \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363374 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqkff\" (UniqueName: \"kubernetes.io/projected/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-kube-api-access-pqkff\") pod \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\" (UID: \"c273f1c9-0358-4bcc-a98d-2c3ab0cd677b\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363403 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srrp8\" (UniqueName: \"kubernetes.io/projected/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-kube-api-access-srrp8\") pod \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363443 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-config-data\") pod 
\"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363464 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-combined-ca-bundle\") pod \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\" (UID: \"f23b7a9b-3694-415b-8e68-d0757a5ccc7a\") " Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363564 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b88a61-4bfa-4704-b3ae-1f61c9b65488-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23b88a61-4bfa-4704-b3ae-1f61c9b65488" (UID: "23b88a61-4bfa-4704-b3ae-1f61c9b65488"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.363920 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a77930-f114-45b8-9256-94d34ad92839-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53a77930-f114-45b8-9256-94d34ad92839" (UID: "53a77930-f114-45b8-9256-94d34ad92839"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.364035 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c273f1c9-0358-4bcc-a98d-2c3ab0cd677b" (UID: "c273f1c9-0358-4bcc-a98d-2c3ab0cd677b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.364482 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f50443-d0a3-4976-943e-0ac0de49b9b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8f50443-d0a3-4976-943e-0ac0de49b9b9" (UID: "d8f50443-d0a3-4976-943e-0ac0de49b9b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.364861 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8f50443-d0a3-4976-943e-0ac0de49b9b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.364883 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53a77930-f114-45b8-9256-94d34ad92839-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.364892 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.364900 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b88a61-4bfa-4704-b3ae-1f61c9b65488-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.367608 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-kube-api-access-pqkff" (OuterVolumeSpecName: "kube-api-access-pqkff") pod "c273f1c9-0358-4bcc-a98d-2c3ab0cd677b" (UID: "c273f1c9-0358-4bcc-a98d-2c3ab0cd677b"). InnerVolumeSpecName "kube-api-access-pqkff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.367647 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f23b7a9b-3694-415b-8e68-d0757a5ccc7a" (UID: "f23b7a9b-3694-415b-8e68-d0757a5ccc7a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.368586 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a77930-f114-45b8-9256-94d34ad92839-kube-api-access-7wfmq" (OuterVolumeSpecName: "kube-api-access-7wfmq") pod "53a77930-f114-45b8-9256-94d34ad92839" (UID: "53a77930-f114-45b8-9256-94d34ad92839"). InnerVolumeSpecName "kube-api-access-7wfmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.369767 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b88a61-4bfa-4704-b3ae-1f61c9b65488-kube-api-access-4hrfm" (OuterVolumeSpecName: "kube-api-access-4hrfm") pod "23b88a61-4bfa-4704-b3ae-1f61c9b65488" (UID: "23b88a61-4bfa-4704-b3ae-1f61c9b65488"). InnerVolumeSpecName "kube-api-access-4hrfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.370237 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-kube-api-access-srrp8" (OuterVolumeSpecName: "kube-api-access-srrp8") pod "f23b7a9b-3694-415b-8e68-d0757a5ccc7a" (UID: "f23b7a9b-3694-415b-8e68-d0757a5ccc7a"). InnerVolumeSpecName "kube-api-access-srrp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.370280 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f50443-d0a3-4976-943e-0ac0de49b9b9-kube-api-access-9dmww" (OuterVolumeSpecName: "kube-api-access-9dmww") pod "d8f50443-d0a3-4976-943e-0ac0de49b9b9" (UID: "d8f50443-d0a3-4976-943e-0ac0de49b9b9"). InnerVolumeSpecName "kube-api-access-9dmww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.385158 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f23b7a9b-3694-415b-8e68-d0757a5ccc7a" (UID: "f23b7a9b-3694-415b-8e68-d0757a5ccc7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.405865 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-config-data" (OuterVolumeSpecName: "config-data") pod "f23b7a9b-3694-415b-8e68-d0757a5ccc7a" (UID: "f23b7a9b-3694-415b-8e68-d0757a5ccc7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.467488 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqkff\" (UniqueName: \"kubernetes.io/projected/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b-kube-api-access-pqkff\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.467529 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srrp8\" (UniqueName: \"kubernetes.io/projected/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-kube-api-access-srrp8\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.467540 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.467554 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.467566 4931 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f23b7a9b-3694-415b-8e68-d0757a5ccc7a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.467577 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hrfm\" (UniqueName: \"kubernetes.io/projected/23b88a61-4bfa-4704-b3ae-1f61c9b65488-kube-api-access-4hrfm\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.467589 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wfmq\" (UniqueName: \"kubernetes.io/projected/53a77930-f114-45b8-9256-94d34ad92839-kube-api-access-7wfmq\") on node \"crc\" DevicePath \"\"" Dec 01 
15:19:38 crc kubenswrapper[4931]: I1201 15:19:38.467600 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dmww\" (UniqueName: \"kubernetes.io/projected/d8f50443-d0a3-4976-943e-0ac0de49b9b9-kube-api-access-9dmww\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.134811 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0a7a-account-create-update-9qs6p" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.146830 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dnbwr" event={"ID":"aeec95d4-0a22-40e6-b1ca-b15703d71b47","Type":"ContainerStarted","Data":"e0c0899b3dd8a7aeb7ca184ab3e643ef350bb7efa3a2bdd04d57cf96832c9f2f"} Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.146976 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-czvkh" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.147008 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xnxrs" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.147738 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bb76-account-create-update-2jg2w" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.149241 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3065-account-create-update-vfvmz" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.149784 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8zdh2" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.194413 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dnbwr" podStartSLOduration=2.93244346 podStartE2EDuration="8.194362455s" podCreationTimestamp="2025-12-01 15:19:31 +0000 UTC" firstStartedPulling="2025-12-01 15:19:32.764623724 +0000 UTC m=+1119.190497391" lastFinishedPulling="2025-12-01 15:19:38.026542719 +0000 UTC m=+1124.452416386" observedRunningTime="2025-12-01 15:19:39.170424043 +0000 UTC m=+1125.596297730" watchObservedRunningTime="2025-12-01 15:19:39.194362455 +0000 UTC m=+1125.620236132" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.490754 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.515124 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe036b57-6753-42af-ad39-195f0688532d-etc-swift\") pod \"swift-storage-0\" (UID: \"fe036b57-6753-42af-ad39-195f0688532d\") " pod="openstack/swift-storage-0" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.605975 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633253 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-flztw"] Dec 01 15:19:39 crc kubenswrapper[4931]: E1201 15:19:39.633586 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2a8742-bece-4c6a-9f4d-6c7d1dc97450" containerName="mariadb-database-create" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633603 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2a8742-bece-4c6a-9f4d-6c7d1dc97450" containerName="mariadb-database-create" Dec 01 15:19:39 crc kubenswrapper[4931]: E1201 15:19:39.633616 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f50443-d0a3-4976-943e-0ac0de49b9b9" containerName="mariadb-account-create-update" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633623 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f50443-d0a3-4976-943e-0ac0de49b9b9" containerName="mariadb-account-create-update" Dec 01 15:19:39 crc kubenswrapper[4931]: E1201 15:19:39.633641 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188bed20-c6ed-4ead-a520-03ec97974362" containerName="mariadb-database-create" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633648 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="188bed20-c6ed-4ead-a520-03ec97974362" containerName="mariadb-database-create" Dec 01 15:19:39 crc kubenswrapper[4931]: E1201 15:19:39.633659 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec217c21-8698-4dd3-a58d-b2626cbfbaf2" containerName="swift-ring-rebalance" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633665 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec217c21-8698-4dd3-a58d-b2626cbfbaf2" containerName="swift-ring-rebalance" Dec 01 15:19:39 crc kubenswrapper[4931]: E1201 15:19:39.633675 4931 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f23b7a9b-3694-415b-8e68-d0757a5ccc7a" containerName="glance-db-sync" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633681 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23b7a9b-3694-415b-8e68-d0757a5ccc7a" containerName="glance-db-sync" Dec 01 15:19:39 crc kubenswrapper[4931]: E1201 15:19:39.633690 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b88a61-4bfa-4704-b3ae-1f61c9b65488" containerName="mariadb-database-create" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633695 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b88a61-4bfa-4704-b3ae-1f61c9b65488" containerName="mariadb-database-create" Dec 01 15:19:39 crc kubenswrapper[4931]: E1201 15:19:39.633708 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c273f1c9-0358-4bcc-a98d-2c3ab0cd677b" containerName="mariadb-account-create-update" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633717 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c273f1c9-0358-4bcc-a98d-2c3ab0cd677b" containerName="mariadb-account-create-update" Dec 01 15:19:39 crc kubenswrapper[4931]: E1201 15:19:39.633730 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a77930-f114-45b8-9256-94d34ad92839" containerName="mariadb-account-create-update" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633736 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a77930-f114-45b8-9256-94d34ad92839" containerName="mariadb-account-create-update" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633877 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c273f1c9-0358-4bcc-a98d-2c3ab0cd677b" containerName="mariadb-account-create-update" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633887 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2a8742-bece-4c6a-9f4d-6c7d1dc97450" containerName="mariadb-database-create" Dec 01 15:19:39 crc 
kubenswrapper[4931]: I1201 15:19:39.633900 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23b7a9b-3694-415b-8e68-d0757a5ccc7a" containerName="glance-db-sync" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633911 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f50443-d0a3-4976-943e-0ac0de49b9b9" containerName="mariadb-account-create-update" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633920 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="188bed20-c6ed-4ead-a520-03ec97974362" containerName="mariadb-database-create" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633926 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b88a61-4bfa-4704-b3ae-1f61c9b65488" containerName="mariadb-database-create" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633934 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec217c21-8698-4dd3-a58d-b2626cbfbaf2" containerName="swift-ring-rebalance" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.633943 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a77930-f114-45b8-9256-94d34ad92839" containerName="mariadb-account-create-update" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.634742 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.695422 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.696055 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-dns-svc\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.696171 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-config\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.696321 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.696578 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ltm\" (UniqueName: \"kubernetes.io/projected/df709027-75b2-45a3-a038-08931bbdc703-kube-api-access-j2ltm\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: 
\"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.702993 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-flztw"] Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.798521 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ltm\" (UniqueName: \"kubernetes.io/projected/df709027-75b2-45a3-a038-08931bbdc703-kube-api-access-j2ltm\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.798583 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.798615 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-dns-svc\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.798664 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-config\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.798701 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.799898 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.799906 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.799981 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-dns-svc\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.800504 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-config\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:39 crc kubenswrapper[4931]: I1201 15:19:39.866709 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ltm\" (UniqueName: \"kubernetes.io/projected/df709027-75b2-45a3-a038-08931bbdc703-kube-api-access-j2ltm\") pod \"dnsmasq-dns-74dc88fc-flztw\" (UID: 
\"df709027-75b2-45a3-a038-08931bbdc703\") " pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:40 crc kubenswrapper[4931]: I1201 15:19:40.014948 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:40 crc kubenswrapper[4931]: W1201 15:19:40.289087 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe036b57_6753_42af_ad39_195f0688532d.slice/crio-316c847763251cd53e6e2e02c32d1e978adc6c9af83ad11d0987a2902a95f3da WatchSource:0}: Error finding container 316c847763251cd53e6e2e02c32d1e978adc6c9af83ad11d0987a2902a95f3da: Status 404 returned error can't find the container with id 316c847763251cd53e6e2e02c32d1e978adc6c9af83ad11d0987a2902a95f3da Dec 01 15:19:40 crc kubenswrapper[4931]: I1201 15:19:40.312717 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 15:19:40 crc kubenswrapper[4931]: I1201 15:19:40.349926 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-flztw"] Dec 01 15:19:41 crc kubenswrapper[4931]: I1201 15:19:41.149109 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"316c847763251cd53e6e2e02c32d1e978adc6c9af83ad11d0987a2902a95f3da"} Dec 01 15:19:41 crc kubenswrapper[4931]: I1201 15:19:41.151068 4931 generic.go:334] "Generic (PLEG): container finished" podID="df709027-75b2-45a3-a038-08931bbdc703" containerID="cfca232bc10f0d831f2e153a96c1aaba1c09a3b4dc28fb5b6c187dc2376fdc64" exitCode=0 Dec 01 15:19:41 crc kubenswrapper[4931]: I1201 15:19:41.151123 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-flztw" event={"ID":"df709027-75b2-45a3-a038-08931bbdc703","Type":"ContainerDied","Data":"cfca232bc10f0d831f2e153a96c1aaba1c09a3b4dc28fb5b6c187dc2376fdc64"} Dec 01 
15:19:41 crc kubenswrapper[4931]: I1201 15:19:41.151153 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-flztw" event={"ID":"df709027-75b2-45a3-a038-08931bbdc703","Type":"ContainerStarted","Data":"dca066c0336338634012ee7a5bc8e0484b15a1ad335caac9d51c904870015072"} Dec 01 15:19:42 crc kubenswrapper[4931]: I1201 15:19:42.163298 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-flztw" event={"ID":"df709027-75b2-45a3-a038-08931bbdc703","Type":"ContainerStarted","Data":"0faf22106e8c73826a44ea93389ffa98f0b56d2509170c108da2899582ee0845"} Dec 01 15:19:42 crc kubenswrapper[4931]: I1201 15:19:42.164030 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:42 crc kubenswrapper[4931]: I1201 15:19:42.165120 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"9b403bb22da1fd6fda8fc3593991c015259c28fbfa4b40f3842806272143bb54"} Dec 01 15:19:42 crc kubenswrapper[4931]: I1201 15:19:42.183367 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dc88fc-flztw" podStartSLOduration=3.183350331 podStartE2EDuration="3.183350331s" podCreationTimestamp="2025-12-01 15:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:42.181372526 +0000 UTC m=+1128.607246203" watchObservedRunningTime="2025-12-01 15:19:42.183350331 +0000 UTC m=+1128.609224018" Dec 01 15:19:43 crc kubenswrapper[4931]: I1201 15:19:43.176614 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"7d12e2119fddf49d4f9e5f4f6a251b76319927c5365d31c4adf8947a6dd564a9"} Dec 01 15:19:43 crc 
kubenswrapper[4931]: I1201 15:19:43.176934 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"fd40a7f029893d1a2a6bf29e3c584d29584f7ae26908817ebf15865779fca03f"} Dec 01 15:19:43 crc kubenswrapper[4931]: I1201 15:19:43.176958 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"88b33c29e2dd6524a2388252e7324d01d602ec2cd7da87b48a9d73e188005b4b"} Dec 01 15:19:44 crc kubenswrapper[4931]: I1201 15:19:44.189130 4931 generic.go:334] "Generic (PLEG): container finished" podID="aeec95d4-0a22-40e6-b1ca-b15703d71b47" containerID="e0c0899b3dd8a7aeb7ca184ab3e643ef350bb7efa3a2bdd04d57cf96832c9f2f" exitCode=0 Dec 01 15:19:44 crc kubenswrapper[4931]: I1201 15:19:44.189185 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dnbwr" event={"ID":"aeec95d4-0a22-40e6-b1ca-b15703d71b47","Type":"ContainerDied","Data":"e0c0899b3dd8a7aeb7ca184ab3e643ef350bb7efa3a2bdd04d57cf96832c9f2f"} Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.220108 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"92a4d78fb7a30fddc82412637523c262ec26d67ae3725050c13a028e0fe22ca9"} Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.221098 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"992c3c75bf71ccf2da21365b30f581480e1ea131d9f01082b32d891c7eca1353"} Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.544309 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.608898 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-config-data\") pod \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.609054 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvglr\" (UniqueName: \"kubernetes.io/projected/aeec95d4-0a22-40e6-b1ca-b15703d71b47-kube-api-access-pvglr\") pod \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.609184 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-combined-ca-bundle\") pod \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\" (UID: \"aeec95d4-0a22-40e6-b1ca-b15703d71b47\") " Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.617183 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeec95d4-0a22-40e6-b1ca-b15703d71b47-kube-api-access-pvglr" (OuterVolumeSpecName: "kube-api-access-pvglr") pod "aeec95d4-0a22-40e6-b1ca-b15703d71b47" (UID: "aeec95d4-0a22-40e6-b1ca-b15703d71b47"). InnerVolumeSpecName "kube-api-access-pvglr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.638205 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeec95d4-0a22-40e6-b1ca-b15703d71b47" (UID: "aeec95d4-0a22-40e6-b1ca-b15703d71b47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.653424 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-config-data" (OuterVolumeSpecName: "config-data") pod "aeec95d4-0a22-40e6-b1ca-b15703d71b47" (UID: "aeec95d4-0a22-40e6-b1ca-b15703d71b47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.711028 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvglr\" (UniqueName: \"kubernetes.io/projected/aeec95d4-0a22-40e6-b1ca-b15703d71b47-kube-api-access-pvglr\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.711066 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:45 crc kubenswrapper[4931]: I1201 15:19:45.711076 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeec95d4-0a22-40e6-b1ca-b15703d71b47-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.232039 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"4e7f92069bb839c2111b4dadda195b61a6f1629019f5a9d6af6003e696026f4c"} Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.233672 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dnbwr" event={"ID":"aeec95d4-0a22-40e6-b1ca-b15703d71b47","Type":"ContainerDied","Data":"3e20b803556bb98fc2544b94bfc5b52e6b26e904973667e903c862212551b7da"} Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.233700 4931 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="3e20b803556bb98fc2544b94bfc5b52e6b26e904973667e903c862212551b7da" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.233741 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dnbwr" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.465579 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-flztw"] Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.466093 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dc88fc-flztw" podUID="df709027-75b2-45a3-a038-08931bbdc703" containerName="dnsmasq-dns" containerID="cri-o://0faf22106e8c73826a44ea93389ffa98f0b56d2509170c108da2899582ee0845" gracePeriod=10 Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.467845 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.495452 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k4w5f"] Dec 01 15:19:46 crc kubenswrapper[4931]: E1201 15:19:46.495945 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeec95d4-0a22-40e6-b1ca-b15703d71b47" containerName="keystone-db-sync" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.495970 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeec95d4-0a22-40e6-b1ca-b15703d71b47" containerName="keystone-db-sync" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.496173 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeec95d4-0a22-40e6-b1ca-b15703d71b47" containerName="keystone-db-sync" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.496893 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k4w5f" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.498814 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.499495 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.499692 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.499916 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.503101 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w5fnq" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.519711 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-2bbm9"] Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.521441 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.530138 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-2bbm9"] Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.546257 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k4w5f"] Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628181 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-config-data\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628241 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628278 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-credential-keys\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628316 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-dns-svc\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" Dec 01 
15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628333 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfkqj\" (UniqueName: \"kubernetes.io/projected/65035c1b-9a6b-466e-91cc-a603c7b56ea5-kube-api-access-nfkqj\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628353 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-combined-ca-bundle\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628370 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628410 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-fernet-keys\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628428 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-scripts\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f" Dec 
01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628455 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnjjs\" (UniqueName: \"kubernetes.io/projected/acebf535-ac84-4676-adc3-cc9638561419-kube-api-access-wnjjs\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.628476 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-config\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.693671 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-588bb94b89-fmwv2"] Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.703939 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.708066 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.708250 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.708278 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.714942 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-z7nrt"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.724809 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-588bb94b89-fmwv2"]
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.730303 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfkqj\" (UniqueName: \"kubernetes.io/projected/65035c1b-9a6b-466e-91cc-a603c7b56ea5-kube-api-access-nfkqj\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.730345 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-dns-svc\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.730371 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-combined-ca-bundle\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.730409 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.731687 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-fernet-keys\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.731755 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-scripts\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.731826 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnjjs\" (UniqueName: \"kubernetes.io/projected/acebf535-ac84-4676-adc3-cc9638561419-kube-api-access-wnjjs\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.731860 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-config\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.731978 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-config-data\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.732040 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.732077 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-credential-keys\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.735086 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-dns-svc\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.741241 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.741544 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-config\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.746198 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-combined-ca-bundle\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.748197 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-credential-keys\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.748487 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-scripts\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.749301 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-config-data\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.760774 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.779199 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-fernet-keys\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.818493 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnjjs\" (UniqueName: \"kubernetes.io/projected/acebf535-ac84-4676-adc3-cc9638561419-kube-api-access-wnjjs\") pod \"dnsmasq-dns-7d5679f497-2bbm9\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.832509 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfkqj\" (UniqueName: \"kubernetes.io/projected/65035c1b-9a6b-466e-91cc-a603c7b56ea5-kube-api-access-nfkqj\") pod \"keystone-bootstrap-k4w5f\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.833799 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f96c96c0-f0e9-45f6-82db-984fb2aec67f-logs\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.833840 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-scripts\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.833881 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcb26\" (UniqueName: \"kubernetes.io/projected/f96c96c0-f0e9-45f6-82db-984fb2aec67f-kube-api-access-vcb26\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.833918 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-config-data\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.833984 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f96c96c0-f0e9-45f6-82db-984fb2aec67f-horizon-secret-key\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.840735 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5679f497-2bbm9"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.852456 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-cmc5n"]
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.853807 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cmc5n"]
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.853895 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.866508 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5w9qw"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.866508 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.866866 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941257 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f96c96c0-f0e9-45f6-82db-984fb2aec67f-logs\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941304 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-scripts\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941325 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-scripts\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941344 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-config-data\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941365 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcb26\" (UniqueName: \"kubernetes.io/projected/f96c96c0-f0e9-45f6-82db-984fb2aec67f-kube-api-access-vcb26\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941402 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/658da6f1-ac70-4d83-83ca-f79e69f0979d-etc-machine-id\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941427 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-config-data\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941474 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f96c96c0-f0e9-45f6-82db-984fb2aec67f-horizon-secret-key\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941491 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h69rh\" (UniqueName: \"kubernetes.io/projected/658da6f1-ac70-4d83-83ca-f79e69f0979d-kube-api-access-h69rh\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941509 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-combined-ca-bundle\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-db-sync-config-data\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.941906 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f96c96c0-f0e9-45f6-82db-984fb2aec67f-logs\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.942889 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-scripts\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.942922 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-config-data\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.955780 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f96c96c0-f0e9-45f6-82db-984fb2aec67f-horizon-secret-key\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.979613 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.981495 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.995200 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 01 15:19:46 crc kubenswrapper[4931]: I1201 15:19:46.995547 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.051699 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcb26\" (UniqueName: \"kubernetes.io/projected/f96c96c0-f0e9-45f6-82db-984fb2aec67f-kube-api-access-vcb26\") pod \"horizon-588bb94b89-fmwv2\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.059421 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-scripts\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.059530 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-config-data\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.059592 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/658da6f1-ac70-4d83-83ca-f79e69f0979d-etc-machine-id\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.059742 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h69rh\" (UniqueName: \"kubernetes.io/projected/658da6f1-ac70-4d83-83ca-f79e69f0979d-kube-api-access-h69rh\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.059763 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-combined-ca-bundle\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.059814 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-db-sync-config-data\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.060046 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/658da6f1-ac70-4d83-83ca-f79e69f0979d-etc-machine-id\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.065949 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-scripts\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.066037 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-2bbm9"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.077185 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-combined-ca-bundle\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.078038 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-db-sync-config-data\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.086698 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h69rh\" (UniqueName: \"kubernetes.io/projected/658da6f1-ac70-4d83-83ca-f79e69f0979d-kube-api-access-h69rh\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.098632 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-config-data\") pod \"cinder-db-sync-cmc5n\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.115632 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cmc5n"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.115948 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k4w5f"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.144884 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-qsr4v"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.146267 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qsr4v"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.159895 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.160789 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.160854 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.160872 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-config-data\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.160897 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-scripts\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.160920 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.160954 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wtrw\" (UniqueName: \"kubernetes.io/projected/6e1dea62-ecc2-401d-b436-acc91fba2d5d-kube-api-access-2wtrw\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.160979 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.184911 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rtw29"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.185167 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.212659 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56798b757f-6q6f9"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.220817 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-6q6f9"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.236460 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9qzq8"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.254282 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qsr4v"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.261456 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ffphc"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.262577 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ffphc"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.265477 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9qzq8"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.265757 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wtrw\" (UniqueName: \"kubernetes.io/projected/6e1dea62-ecc2-401d-b436-acc91fba2d5d-kube-api-access-2wtrw\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.265853 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.266128 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrdxb\" (UniqueName: \"kubernetes.io/projected/a79a9139-dcea-4b3f-83dc-a1715f087ac5-kube-api-access-rrdxb\") pod \"barbican-db-sync-qsr4v\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " pod="openstack/barbican-db-sync-qsr4v"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.266252 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.266324 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-db-sync-config-data\") pod \"barbican-db-sync-qsr4v\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " pod="openstack/barbican-db-sync-qsr4v"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.266459 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-config-data\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.266540 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.266622 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-combined-ca-bundle\") pod \"barbican-db-sync-qsr4v\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " pod="openstack/barbican-db-sync-qsr4v"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.266702 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-scripts\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.266807 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.267918 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-run-httpd\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.268304 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-log-httpd\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.270675 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.270879 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.275765 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.276006 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-x78jr"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.278291 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.280798 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bcrdg"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.284563 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9qzq8"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.306993 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.307519 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-config-data\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.307886 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.308434 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-scripts\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.311000 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wtrw\" (UniqueName: \"kubernetes.io/projected/6e1dea62-ecc2-401d-b436-acc91fba2d5d-kube-api-access-2wtrw\") pod \"ceilometer-0\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " pod="openstack/ceilometer-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.312771 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-6q6f9"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.333687 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ffphc"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.340962 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-588bb94b89-fmwv2"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.362440 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68669fb557-g4zdb"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.364312 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68669fb557-g4zdb"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.378900 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19efc2e2-7d7e-455c-b966-525f140a2dad-logs\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.378964 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-config\") pod \"neutron-db-sync-9qzq8\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " pod="openstack/neutron-db-sync-9qzq8"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.378996 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.379026 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-dns-svc\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.379065 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrdxb\" (UniqueName: \"kubernetes.io/projected/a79a9139-dcea-4b3f-83dc-a1715f087ac5-kube-api-access-rrdxb\") pod \"barbican-db-sync-qsr4v\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " 
pod="openstack/barbican-db-sync-qsr4v" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.379084 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-config\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.379112 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjxkw\" (UniqueName: \"kubernetes.io/projected/38f515fe-6925-463c-b5dc-87b23d360ec5-kube-api-access-xjxkw\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.379808 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-db-sync-config-data\") pod \"barbican-db-sync-qsr4v\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " pod="openstack/barbican-db-sync-qsr4v" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.379918 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-combined-ca-bundle\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.384913 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68669fb557-g4zdb"] Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.388526 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.388617 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-scripts\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.388686 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kr7x\" (UniqueName: \"kubernetes.io/projected/f9ada3f9-8074-48c1-a190-d6535e26e14f-kube-api-access-5kr7x\") pod \"neutron-db-sync-9qzq8\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.388734 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-scripts\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.388790 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh4td\" (UniqueName: \"kubernetes.io/projected/99c8e9db-f582-408e-a3ba-a0b583c3218c-kube-api-access-bh4td\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.388812 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/38f515fe-6925-463c-b5dc-87b23d360ec5-logs\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.388892 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-combined-ca-bundle\") pod \"barbican-db-sync-qsr4v\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " pod="openstack/barbican-db-sync-qsr4v" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.388974 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-combined-ca-bundle\") pod \"neutron-db-sync-9qzq8\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.389000 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd2rd\" (UniqueName: \"kubernetes.io/projected/19efc2e2-7d7e-455c-b966-525f140a2dad-kube-api-access-jd2rd\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.389040 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19efc2e2-7d7e-455c-b966-525f140a2dad-horizon-secret-key\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.389070 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-config-data\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.389106 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-config-data\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.400097 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-db-sync-config-data\") pod \"barbican-db-sync-qsr4v\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " pod="openstack/barbican-db-sync-qsr4v" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.401334 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-combined-ca-bundle\") pod \"barbican-db-sync-qsr4v\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " pod="openstack/barbican-db-sync-qsr4v" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.414204 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrdxb\" (UniqueName: \"kubernetes.io/projected/a79a9139-dcea-4b3f-83dc-a1715f087ac5-kube-api-access-rrdxb\") pod \"barbican-db-sync-qsr4v\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " pod="openstack/barbican-db-sync-qsr4v" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.446892 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.457263 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.459107 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.466073 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.466294 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.470541 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6wcpc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.480633 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.481764 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.490613 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-config\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.490692 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjxkw\" (UniqueName: \"kubernetes.io/projected/38f515fe-6925-463c-b5dc-87b23d360ec5-kube-api-access-xjxkw\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " 
pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.490989 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-combined-ca-bundle\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.491019 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.491051 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-scripts\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.491910 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-config\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.492911 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.495671 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-scripts\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.495851 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kr7x\" (UniqueName: \"kubernetes.io/projected/f9ada3f9-8074-48c1-a190-d6535e26e14f-kube-api-access-5kr7x\") pod \"neutron-db-sync-9qzq8\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.495902 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-scripts\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.495970 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh4td\" (UniqueName: \"kubernetes.io/projected/99c8e9db-f582-408e-a3ba-a0b583c3218c-kube-api-access-bh4td\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.495991 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f515fe-6925-463c-b5dc-87b23d360ec5-logs\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.496100 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-combined-ca-bundle\") pod \"neutron-db-sync-9qzq8\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.496126 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd2rd\" (UniqueName: \"kubernetes.io/projected/19efc2e2-7d7e-455c-b966-525f140a2dad-kube-api-access-jd2rd\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.496160 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19efc2e2-7d7e-455c-b966-525f140a2dad-horizon-secret-key\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.496183 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-config-data\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.496224 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-config-data\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.496297 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19efc2e2-7d7e-455c-b966-525f140a2dad-logs\") pod \"horizon-68669fb557-g4zdb\" 
(UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.496322 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-config\") pod \"neutron-db-sync-9qzq8\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.496351 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.496397 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-dns-svc\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.497145 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-dns-svc\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.499540 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f515fe-6925-463c-b5dc-87b23d360ec5-logs\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.501861 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-6q6f9\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.502283 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19efc2e2-7d7e-455c-b966-525f140a2dad-logs\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.503321 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-config-data\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.505431 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-combined-ca-bundle\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.516028 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.518058 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.522001 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-combined-ca-bundle\") pod \"neutron-db-sync-9qzq8\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.523894 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.524139 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.526614 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-config\") pod \"neutron-db-sync-9qzq8\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.528915 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.533595 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjxkw\" (UniqueName: \"kubernetes.io/projected/38f515fe-6925-463c-b5dc-87b23d360ec5-kube-api-access-xjxkw\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.533832 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh4td\" (UniqueName: \"kubernetes.io/projected/99c8e9db-f582-408e-a3ba-a0b583c3218c-kube-api-access-bh4td\") pod \"dnsmasq-dns-56798b757f-6q6f9\" 
(UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.539616 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-scripts\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.543766 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19efc2e2-7d7e-455c-b966-525f140a2dad-horizon-secret-key\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.542461 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-config-data\") pod \"placement-db-sync-ffphc\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " pod="openstack/placement-db-sync-ffphc" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.551028 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kr7x\" (UniqueName: \"kubernetes.io/projected/f9ada3f9-8074-48c1-a190-d6535e26e14f-kube-api-access-5kr7x\") pod \"neutron-db-sync-9qzq8\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.551820 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd2rd\" (UniqueName: \"kubernetes.io/projected/19efc2e2-7d7e-455c-b966-525f140a2dad-kube-api-access-jd2rd\") pod \"horizon-68669fb557-g4zdb\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:19:47 crc kubenswrapper[4931]: 
I1201 15:19:47.599436 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599489 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599518 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-logs\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599542 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0" Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599562 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-config-data\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:19:47 crc kubenswrapper[4931]: 
I1201 15:19:47.599621 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599651 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-logs\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599675 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nccg8\" (UniqueName: \"kubernetes.io/projected/c07769f8-f78c-4248-b590-f916b5362cd3-kube-api-access-nccg8\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599767 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599801 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599826 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599851 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-scripts\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599887 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599906 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599922 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.599940 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l6q6\" (UniqueName: \"kubernetes.io/projected/174578a2-050b-44a8-9f8c-75d918589c78-kube-api-access-4l6q6\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.615479 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qsr4v"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.637772 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-6q6f9"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.673299 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ffphc"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702189 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702233 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702253 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702274 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-scripts\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702305 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702320 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702333 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702350 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l6q6\" (UniqueName: \"kubernetes.io/projected/174578a2-050b-44a8-9f8c-75d918589c78-kube-api-access-4l6q6\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702368 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702414 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702441 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-logs\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702459 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702480 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-config-data\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702522 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702546 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-logs\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.702564 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nccg8\" (UniqueName: \"kubernetes.io/projected/c07769f8-f78c-4248-b590-f916b5362cd3-kube-api-access-nccg8\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.703232 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.703619 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.703728 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.704444 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.704789 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-logs\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.707289 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-logs\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.708854 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-config-data\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.709694 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9qzq8"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.709847 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.712013 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-scripts\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.718873 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.718942 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.720628 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.721131 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-scripts\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.723045 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nccg8\" (UniqueName: \"kubernetes.io/projected/c07769f8-f78c-4248-b590-f916b5362cd3-kube-api-access-nccg8\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.728730 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l6q6\" (UniqueName: \"kubernetes.io/projected/174578a2-050b-44a8-9f8c-75d918589c78-kube-api-access-4l6q6\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.733737 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-config-data\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.748108 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.748681 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68669fb557-g4zdb"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.790759 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.794731 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-2bbm9"]
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.876111 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.889803 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 01 15:19:47 crc kubenswrapper[4931]: I1201 15:19:47.910846 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cmc5n"]
Dec 01 15:19:48 crc kubenswrapper[4931]: I1201 15:19:48.101480 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k4w5f"]
Dec 01 15:19:48 crc kubenswrapper[4931]: I1201 15:19:48.302799 4931 generic.go:334] "Generic (PLEG): container finished" podID="df709027-75b2-45a3-a038-08931bbdc703" containerID="0faf22106e8c73826a44ea93389ffa98f0b56d2509170c108da2899582ee0845" exitCode=0
Dec 01 15:19:48 crc kubenswrapper[4931]: I1201 15:19:48.302848 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-flztw" event={"ID":"df709027-75b2-45a3-a038-08931bbdc703","Type":"ContainerDied","Data":"0faf22106e8c73826a44ea93389ffa98f0b56d2509170c108da2899582ee0845"}
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.400762 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.471327 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-588bb94b89-fmwv2"]
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.489044 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.505056 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.517338 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bd7477fc7-x2lt5"]
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.526086 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.538323 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bd7477fc7-x2lt5"]
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.638770 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1330f183-8546-45e4-8ad8-9ec6eb7affa6-horizon-secret-key\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.639014 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-config-data\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.639059 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1330f183-8546-45e4-8ad8-9ec6eb7affa6-logs\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.639082 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bsfz\" (UniqueName: \"kubernetes.io/projected/1330f183-8546-45e4-8ad8-9ec6eb7affa6-kube-api-access-4bsfz\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.639179 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-scripts\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.740279 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-scripts\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.740443 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1330f183-8546-45e4-8ad8-9ec6eb7affa6-horizon-secret-key\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.740469 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-config-data\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.740521 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1330f183-8546-45e4-8ad8-9ec6eb7affa6-logs\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.740552 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bsfz\" (UniqueName: \"kubernetes.io/projected/1330f183-8546-45e4-8ad8-9ec6eb7affa6-kube-api-access-4bsfz\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.740851 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1330f183-8546-45e4-8ad8-9ec6eb7affa6-logs\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.741131 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-scripts\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.741863 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-config-data\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.748359 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1330f183-8546-45e4-8ad8-9ec6eb7affa6-horizon-secret-key\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.754891 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bsfz\" (UniqueName: \"kubernetes.io/projected/1330f183-8546-45e4-8ad8-9ec6eb7affa6-kube-api-access-4bsfz\") pod \"horizon-bd7477fc7-x2lt5\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:49 crc kubenswrapper[4931]: I1201 15:19:49.848656 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd7477fc7-x2lt5"
Dec 01 15:19:50 crc kubenswrapper[4931]: I1201 15:19:50.015907 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dc88fc-flztw" podUID="df709027-75b2-45a3-a038-08931bbdc703" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused"
Dec 01 15:19:50 crc kubenswrapper[4931]: W1201 15:19:50.066456 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacebf535_ac84_4676_adc3_cc9638561419.slice/crio-1ed0a7f2c6f90a19ed11b2627211dbd8c52706007429fe1335c7150e2f3a6107 WatchSource:0}: Error finding container 1ed0a7f2c6f90a19ed11b2627211dbd8c52706007429fe1335c7150e2f3a6107: Status 404 returned error can't find the container with id 1ed0a7f2c6f90a19ed11b2627211dbd8c52706007429fe1335c7150e2f3a6107
Dec 01 15:19:50 crc kubenswrapper[4931]: W1201 15:19:50.094208 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod658da6f1_ac70_4d83_83ca_f79e69f0979d.slice/crio-5adc16cb91c1386b583c653b6632be1423676e99f137a3df706caa3fdad7854f WatchSource:0}: Error finding container 5adc16cb91c1386b583c653b6632be1423676e99f137a3df706caa3fdad7854f: Status 404 returned error can't find the container with id 5adc16cb91c1386b583c653b6632be1423676e99f137a3df706caa3fdad7854f
Dec 01 15:19:50 crc kubenswrapper[4931]: I1201 15:19:50.322230 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" event={"ID":"acebf535-ac84-4676-adc3-cc9638561419","Type":"ContainerStarted","Data":"1ed0a7f2c6f90a19ed11b2627211dbd8c52706007429fe1335c7150e2f3a6107"}
Dec 01 15:19:50 crc kubenswrapper[4931]: I1201 15:19:50.338357 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"1bf1e9128d37524b53b39bb288c0a9bbbffba98f690fde869457b3d1030b66df"}
Dec 01 15:19:50 crc kubenswrapper[4931]: I1201 15:19:50.344140 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4w5f" event={"ID":"65035c1b-9a6b-466e-91cc-a603c7b56ea5","Type":"ContainerStarted","Data":"e3f9931c6fbfd123fbbf3c5e2b8d609322fa15a9ec1d48218d0cc3ebed79ec3c"}
Dec 01 15:19:50 crc kubenswrapper[4931]: I1201 15:19:50.349540 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cmc5n" event={"ID":"658da6f1-ac70-4d83-83ca-f79e69f0979d","Type":"ContainerStarted","Data":"5adc16cb91c1386b583c653b6632be1423676e99f137a3df706caa3fdad7854f"}
Dec 01 15:19:50 crc kubenswrapper[4931]: I1201 15:19:50.576748 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-588bb94b89-fmwv2"]
Dec 01 15:19:50 crc kubenswrapper[4931]: I1201 15:19:50.954876 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-flztw"
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:50.998439 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-6q6f9"]
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.032843 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.073021 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-nb\") pod \"df709027-75b2-45a3-a038-08931bbdc703\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") "
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.073062 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-config\") pod \"df709027-75b2-45a3-a038-08931bbdc703\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") "
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.073180 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2ltm\" (UniqueName: \"kubernetes.io/projected/df709027-75b2-45a3-a038-08931bbdc703-kube-api-access-j2ltm\") pod \"df709027-75b2-45a3-a038-08931bbdc703\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") "
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.073298 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-dns-svc\") pod \"df709027-75b2-45a3-a038-08931bbdc703\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") "
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.073406 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-sb\") pod \"df709027-75b2-45a3-a038-08931bbdc703\" (UID: \"df709027-75b2-45a3-a038-08931bbdc703\") "
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.087665 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bd7477fc7-x2lt5"]
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.094518 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9qzq8"]
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.095869 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df709027-75b2-45a3-a038-08931bbdc703-kube-api-access-j2ltm" (OuterVolumeSpecName: "kube-api-access-j2ltm") pod "df709027-75b2-45a3-a038-08931bbdc703" (UID: "df709027-75b2-45a3-a038-08931bbdc703"). InnerVolumeSpecName "kube-api-access-j2ltm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.136025 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-config" (OuterVolumeSpecName: "config") pod "df709027-75b2-45a3-a038-08931bbdc703" (UID: "df709027-75b2-45a3-a038-08931bbdc703"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.143901 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df709027-75b2-45a3-a038-08931bbdc703" (UID: "df709027-75b2-45a3-a038-08931bbdc703"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.146326 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df709027-75b2-45a3-a038-08931bbdc703" (UID: "df709027-75b2-45a3-a038-08931bbdc703"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.157067 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df709027-75b2-45a3-a038-08931bbdc703" (UID: "df709027-75b2-45a3-a038-08931bbdc703"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.180159 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.180326 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.180438 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.180564 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df709027-75b2-45a3-a038-08931bbdc703-config\") on node \"crc\" DevicePath \"\""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.180637 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2ltm\" (UniqueName: \"kubernetes.io/projected/df709027-75b2-45a3-a038-08931bbdc703-kube-api-access-j2ltm\") on node \"crc\" DevicePath \"\""
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.292046 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ffphc"]
Dec 01 15:19:51 crc kubenswrapper[4931]: W1201 15:19:51.300002 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda79a9139_dcea_4b3f_83dc_a1715f087ac5.slice/crio-1cbe00203cd76d9bdbf62f2370529250bc82e5aee7b57036d781006a8e0a95f2 WatchSource:0}: Error finding container 1cbe00203cd76d9bdbf62f2370529250bc82e5aee7b57036d781006a8e0a95f2: Status 404 returned error can't find the container with id 1cbe00203cd76d9bdbf62f2370529250bc82e5aee7b57036d781006a8e0a95f2
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.301198 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qsr4v"]
Dec 01 15:19:51 crc kubenswrapper[4931]: W1201 15:19:51.305313 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38f515fe_6925_463c_b5dc_87b23d360ec5.slice/crio-92e40620aea4ac001f62cf6b285b3e1cb5403fd0c7c98cb361043b628cd588ed WatchSource:0}: Error finding container 92e40620aea4ac001f62cf6b285b3e1cb5403fd0c7c98cb361043b628cd588ed: Status 404 returned error can't find the container with id 92e40620aea4ac001f62cf6b285b3e1cb5403fd0c7c98cb361043b628cd588ed
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.363089 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9qzq8" event={"ID":"f9ada3f9-8074-48c1-a190-d6535e26e14f","Type":"ContainerStarted","Data":"62e26de25f4e45ec1dd95fc4d57b3fd2474c0eb0b45e67d57e026a25c974903c"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.363142 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9qzq8" event={"ID":"f9ada3f9-8074-48c1-a190-d6535e26e14f","Type":"ContainerStarted","Data":"b49186fa60654394f63e110187c1f0b6e4827072d171a189dfc5cb43a545cf85"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.367226 4931 generic.go:334] "Generic (PLEG): container finished" podID="99c8e9db-f582-408e-a3ba-a0b583c3218c" containerID="a3aeec915025240539a85f2ce1224ff52450238dbd0a3b5faba20328794f396b" exitCode=0
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.367325 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" event={"ID":"99c8e9db-f582-408e-a3ba-a0b583c3218c","Type":"ContainerDied","Data":"a3aeec915025240539a85f2ce1224ff52450238dbd0a3b5faba20328794f396b"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.367350 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" event={"ID":"99c8e9db-f582-408e-a3ba-a0b583c3218c","Type":"ContainerStarted","Data":"63bb888d4f64e8c6f3e13a3052f20ce010e0ba291daa7243c451e7faa1e137d8"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.369500 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd7477fc7-x2lt5" event={"ID":"1330f183-8546-45e4-8ad8-9ec6eb7affa6","Type":"ContainerStarted","Data":"3a18193436b7a9ec8dbc807cdf0c9210631a7a660901cebbb0a7d376e10e7798"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.370668 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1dea62-ecc2-401d-b436-acc91fba2d5d","Type":"ContainerStarted","Data":"b7815eac10028557d88d23ec8ed3d80680d38060787c776b6694afb2da3f62a8"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.374221 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-588bb94b89-fmwv2" event={"ID":"f96c96c0-f0e9-45f6-82db-984fb2aec67f","Type":"ContainerStarted","Data":"5f248700e3cb416cb7f09ca06722dd20bdb029055effdd142cbfe51c1fd2672d"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.384084 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ffphc" event={"ID":"38f515fe-6925-463c-b5dc-87b23d360ec5","Type":"ContainerStarted","Data":"92e40620aea4ac001f62cf6b285b3e1cb5403fd0c7c98cb361043b628cd588ed"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.389878 4931 generic.go:334] "Generic (PLEG): container finished" podID="acebf535-ac84-4676-adc3-cc9638561419" containerID="3ad492fdecb4e62ca7d198fb77e0dc2c3f451145b8bce1aa2fe8a624e3c1aa9c" exitCode=0
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.389950 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" event={"ID":"acebf535-ac84-4676-adc3-cc9638561419","Type":"ContainerDied","Data":"3ad492fdecb4e62ca7d198fb77e0dc2c3f451145b8bce1aa2fe8a624e3c1aa9c"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.394261 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4w5f" event={"ID":"65035c1b-9a6b-466e-91cc-a603c7b56ea5","Type":"ContainerStarted","Data":"f661bb563ead0bf260e8494ec73c509e32bae2f6ca65512ec8607f11dc2c0d22"}
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.435176 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9qzq8" podStartSLOduration=4.43515333 podStartE2EDuration="4.43515333s" podCreationTimestamp="2025-12-01 15:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:51.380439233 +0000 UTC m=+1137.806312930" watchObservedRunningTime="2025-12-01 15:19:51.43515333 +0000 UTC m=+1137.861026997"
Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.445906 4931 util.go:48] "No ready sandbox
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-flztw" Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.446005 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-flztw" event={"ID":"df709027-75b2-45a3-a038-08931bbdc703","Type":"ContainerDied","Data":"dca066c0336338634012ee7a5bc8e0484b15a1ad335caac9d51c904870015072"} Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.446085 4931 scope.go:117] "RemoveContainer" containerID="0faf22106e8c73826a44ea93389ffa98f0b56d2509170c108da2899582ee0845" Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.451962 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qsr4v" event={"ID":"a79a9139-dcea-4b3f-83dc-a1715f087ac5","Type":"ContainerStarted","Data":"1cbe00203cd76d9bdbf62f2370529250bc82e5aee7b57036d781006a8e0a95f2"} Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.503059 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68669fb557-g4zdb"] Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.503572 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k4w5f" podStartSLOduration=5.503560602 podStartE2EDuration="5.503560602s" podCreationTimestamp="2025-12-01 15:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:51.452308952 +0000 UTC m=+1137.878182619" watchObservedRunningTime="2025-12-01 15:19:51.503560602 +0000 UTC m=+1137.929434269" Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.542038 4931 scope.go:117] "RemoveContainer" containerID="cfca232bc10f0d831f2e153a96c1aaba1c09a3b4dc28fb5b6c187dc2376fdc64" Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.551063 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-flztw"] Dec 01 15:19:51 crc kubenswrapper[4931]: 
I1201 15:19:51.578085 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-flztw"] Dec 01 15:19:51 crc kubenswrapper[4931]: W1201 15:19:51.609893 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod174578a2_050b_44a8_9f8c_75d918589c78.slice/crio-8054530f5ae47953e3f8e18a2f1b7634712e91e2d035d7b89f68bf5ea9a4cbbf WatchSource:0}: Error finding container 8054530f5ae47953e3f8e18a2f1b7634712e91e2d035d7b89f68bf5ea9a4cbbf: Status 404 returned error can't find the container with id 8054530f5ae47953e3f8e18a2f1b7634712e91e2d035d7b89f68bf5ea9a4cbbf Dec 01 15:19:51 crc kubenswrapper[4931]: I1201 15:19:51.618779 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.255311 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df709027-75b2-45a3-a038-08931bbdc703" path="/var/lib/kubelet/pods/df709027-75b2-45a3-a038-08931bbdc703/volumes" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.285786 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.424074 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-nb\") pod \"acebf535-ac84-4676-adc3-cc9638561419\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.424188 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-config\") pod \"acebf535-ac84-4676-adc3-cc9638561419\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.424328 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnjjs\" (UniqueName: \"kubernetes.io/projected/acebf535-ac84-4676-adc3-cc9638561419-kube-api-access-wnjjs\") pod \"acebf535-ac84-4676-adc3-cc9638561419\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.424427 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-dns-svc\") pod \"acebf535-ac84-4676-adc3-cc9638561419\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.424452 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-sb\") pod \"acebf535-ac84-4676-adc3-cc9638561419\" (UID: \"acebf535-ac84-4676-adc3-cc9638561419\") " Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.435028 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/acebf535-ac84-4676-adc3-cc9638561419-kube-api-access-wnjjs" (OuterVolumeSpecName: "kube-api-access-wnjjs") pod "acebf535-ac84-4676-adc3-cc9638561419" (UID: "acebf535-ac84-4676-adc3-cc9638561419"). InnerVolumeSpecName "kube-api-access-wnjjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.453498 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-config" (OuterVolumeSpecName: "config") pod "acebf535-ac84-4676-adc3-cc9638561419" (UID: "acebf535-ac84-4676-adc3-cc9638561419"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.494479 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "acebf535-ac84-4676-adc3-cc9638561419" (UID: "acebf535-ac84-4676-adc3-cc9638561419"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.501712 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" event={"ID":"99c8e9db-f582-408e-a3ba-a0b583c3218c","Type":"ContainerStarted","Data":"4bc915050d6b0979096f7179f32d5769ae62fc6d2cb5cf43b1cb63e4bc62ad8e"} Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.502844 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.522937 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "acebf535-ac84-4676-adc3-cc9638561419" (UID: "acebf535-ac84-4676-adc3-cc9638561419"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.526660 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnjjs\" (UniqueName: \"kubernetes.io/projected/acebf535-ac84-4676-adc3-cc9638561419-kube-api-access-wnjjs\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.526693 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.526707 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.526721 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-config\") on node 
\"crc\" DevicePath \"\"" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.533899 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "acebf535-ac84-4676-adc3-cc9638561419" (UID: "acebf535-ac84-4676-adc3-cc9638561419"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.566724 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" event={"ID":"acebf535-ac84-4676-adc3-cc9638561419","Type":"ContainerDied","Data":"1ed0a7f2c6f90a19ed11b2627211dbd8c52706007429fe1335c7150e2f3a6107"} Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.566991 4931 scope.go:117] "RemoveContainer" containerID="3ad492fdecb4e62ca7d198fb77e0dc2c3f451145b8bce1aa2fe8a624e3c1aa9c" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.567212 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5679f497-2bbm9" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.604431 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" podStartSLOduration=6.604412106 podStartE2EDuration="6.604412106s" podCreationTimestamp="2025-12-01 15:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:52.567759397 +0000 UTC m=+1138.993633064" watchObservedRunningTime="2025-12-01 15:19:52.604412106 +0000 UTC m=+1139.030285773" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.622532 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"174578a2-050b-44a8-9f8c-75d918589c78","Type":"ContainerStarted","Data":"8054530f5ae47953e3f8e18a2f1b7634712e91e2d035d7b89f68bf5ea9a4cbbf"} Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.634571 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/acebf535-ac84-4676-adc3-cc9638561419-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.660954 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.709440 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68669fb557-g4zdb" event={"ID":"19efc2e2-7d7e-455c-b966-525f140a2dad","Type":"ContainerStarted","Data":"2713b0106d037ae51c206465fcd0d1d1e3695bb69f6381834c623445c09beb26"} Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.735417 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-2bbm9"] Dec 01 15:19:52 crc kubenswrapper[4931]: I1201 15:19:52.748591 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7d5679f497-2bbm9"] Dec 01 15:19:53 crc kubenswrapper[4931]: I1201 15:19:53.747025 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"9d66aa92d66ee4a26bccb6500988c6fc4f41436ea767735d6535f4ba3150aed1"} Dec 01 15:19:53 crc kubenswrapper[4931]: I1201 15:19:53.753152 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"174578a2-050b-44a8-9f8c-75d918589c78","Type":"ContainerStarted","Data":"15f9cb36f6d653ce9e721db1bcf38fc6a4039d817385c223e9d7107ae75e02f9"} Dec 01 15:19:53 crc kubenswrapper[4931]: I1201 15:19:53.754889 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c07769f8-f78c-4248-b590-f916b5362cd3","Type":"ContainerStarted","Data":"79fbb0f34c99bd20a5d2a68dc80e7a756df5ea8a0209a1ff0cc52e59331be07f"} Dec 01 15:19:54 crc kubenswrapper[4931]: I1201 15:19:54.262369 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acebf535-ac84-4676-adc3-cc9638561419" path="/var/lib/kubelet/pods/acebf535-ac84-4676-adc3-cc9638561419/volumes" Dec 01 15:19:54 crc kubenswrapper[4931]: I1201 15:19:54.765157 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c07769f8-f78c-4248-b590-f916b5362cd3","Type":"ContainerStarted","Data":"88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93"} Dec 01 15:19:54 crc kubenswrapper[4931]: I1201 15:19:54.781706 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"2b709d84cf4841769237dc6f7ff1205951c07609b5e2dcbe1b540c1c3afa6270"} Dec 01 15:19:55 crc kubenswrapper[4931]: I1201 15:19:55.793110 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"174578a2-050b-44a8-9f8c-75d918589c78","Type":"ContainerStarted","Data":"6418cf119c9faa1901b2d0d27709caca64b33694788e403f87f151282873395c"} Dec 01 15:19:55 crc kubenswrapper[4931]: I1201 15:19:55.793239 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="174578a2-050b-44a8-9f8c-75d918589c78" containerName="glance-log" containerID="cri-o://15f9cb36f6d653ce9e721db1bcf38fc6a4039d817385c223e9d7107ae75e02f9" gracePeriod=30 Dec 01 15:19:55 crc kubenswrapper[4931]: I1201 15:19:55.793507 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="174578a2-050b-44a8-9f8c-75d918589c78" containerName="glance-httpd" containerID="cri-o://6418cf119c9faa1901b2d0d27709caca64b33694788e403f87f151282873395c" gracePeriod=30 Dec 01 15:19:55 crc kubenswrapper[4931]: I1201 15:19:55.823936 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.823915457 podStartE2EDuration="8.823915457s" podCreationTimestamp="2025-12-01 15:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:19:55.816219601 +0000 UTC m=+1142.242093278" watchObservedRunningTime="2025-12-01 15:19:55.823915457 +0000 UTC m=+1142.249789134" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.232704 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68669fb557-g4zdb"] Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.267279 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6479b7c68-txrvx"] Dec 01 15:19:56 crc kubenswrapper[4931]: E1201 15:19:56.272485 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df709027-75b2-45a3-a038-08931bbdc703" containerName="init" 
Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.272525 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="df709027-75b2-45a3-a038-08931bbdc703" containerName="init" Dec 01 15:19:56 crc kubenswrapper[4931]: E1201 15:19:56.272541 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acebf535-ac84-4676-adc3-cc9638561419" containerName="init" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.272548 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="acebf535-ac84-4676-adc3-cc9638561419" containerName="init" Dec 01 15:19:56 crc kubenswrapper[4931]: E1201 15:19:56.272570 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df709027-75b2-45a3-a038-08931bbdc703" containerName="dnsmasq-dns" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.272578 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="df709027-75b2-45a3-a038-08931bbdc703" containerName="dnsmasq-dns" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.272797 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="acebf535-ac84-4676-adc3-cc9638561419" containerName="init" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.272829 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="df709027-75b2-45a3-a038-08931bbdc703" containerName="dnsmasq-dns" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.273921 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.276431 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.292647 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6479b7c68-txrvx"] Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.325850 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-config-data\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.325896 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-combined-ca-bundle\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.325929 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-tls-certs\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.325954 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-secret-key\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 
crc kubenswrapper[4931]: I1201 15:19:56.325973 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-scripts\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.325996 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2l7x\" (UniqueName: \"kubernetes.io/projected/97ed61f3-8ca0-4aee-afae-168398babe70-kube-api-access-m2l7x\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.326038 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ed61f3-8ca0-4aee-afae-168398babe70-logs\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.338373 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bd7477fc7-x2lt5"] Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.353279 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-65c944c654-l6mmj"] Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.358634 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.371107 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65c944c654-l6mmj"] Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.427884 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a2f9f3b-603b-4004-8e6f-dce5b810785c-config-data\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.427969 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2f9f3b-603b-4004-8e6f-dce5b810785c-horizon-tls-certs\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428009 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a2f9f3b-603b-4004-8e6f-dce5b810785c-scripts\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428031 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a2f9f3b-603b-4004-8e6f-dce5b810785c-logs\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428079 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-config-data\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428108 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-combined-ca-bundle\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428149 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-tls-certs\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428181 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-secret-key\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428205 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-scripts\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428243 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2l7x\" (UniqueName: \"kubernetes.io/projected/97ed61f3-8ca0-4aee-afae-168398babe70-kube-api-access-m2l7x\") pod 
\"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428270 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48flh\" (UniqueName: \"kubernetes.io/projected/1a2f9f3b-603b-4004-8e6f-dce5b810785c-kube-api-access-48flh\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428305 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a2f9f3b-603b-4004-8e6f-dce5b810785c-combined-ca-bundle\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428353 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a2f9f3b-603b-4004-8e6f-dce5b810785c-horizon-secret-key\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.428401 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ed61f3-8ca0-4aee-afae-168398babe70-logs\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.433106 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-config-data\") pod \"horizon-6479b7c68-txrvx\" (UID: 
\"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.435327 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ed61f3-8ca0-4aee-afae-168398babe70-logs\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.438423 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-scripts\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.439536 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-secret-key\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.447175 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-combined-ca-bundle\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.449551 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2l7x\" (UniqueName: \"kubernetes.io/projected/97ed61f3-8ca0-4aee-afae-168398babe70-kube-api-access-m2l7x\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.460225 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-tls-certs\") pod \"horizon-6479b7c68-txrvx\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.537852 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48flh\" (UniqueName: \"kubernetes.io/projected/1a2f9f3b-603b-4004-8e6f-dce5b810785c-kube-api-access-48flh\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.538223 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a2f9f3b-603b-4004-8e6f-dce5b810785c-combined-ca-bundle\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.538305 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a2f9f3b-603b-4004-8e6f-dce5b810785c-horizon-secret-key\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.538417 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a2f9f3b-603b-4004-8e6f-dce5b810785c-config-data\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.538452 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2f9f3b-603b-4004-8e6f-dce5b810785c-horizon-tls-certs\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.538503 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a2f9f3b-603b-4004-8e6f-dce5b810785c-scripts\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.538523 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a2f9f3b-603b-4004-8e6f-dce5b810785c-logs\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.539224 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a2f9f3b-603b-4004-8e6f-dce5b810785c-logs\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.542552 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a2f9f3b-603b-4004-8e6f-dce5b810785c-scripts\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.543888 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a2f9f3b-603b-4004-8e6f-dce5b810785c-config-data\") pod \"horizon-65c944c654-l6mmj\" (UID: 
\"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.558740 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a2f9f3b-603b-4004-8e6f-dce5b810785c-horizon-secret-key\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.559045 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a2f9f3b-603b-4004-8e6f-dce5b810785c-combined-ca-bundle\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.559525 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a2f9f3b-603b-4004-8e6f-dce5b810785c-horizon-tls-certs\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.570247 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48flh\" (UniqueName: \"kubernetes.io/projected/1a2f9f3b-603b-4004-8e6f-dce5b810785c-kube-api-access-48flh\") pod \"horizon-65c944c654-l6mmj\" (UID: \"1a2f9f3b-603b-4004-8e6f-dce5b810785c\") " pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.600134 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.674317 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.801488 4931 generic.go:334] "Generic (PLEG): container finished" podID="65035c1b-9a6b-466e-91cc-a603c7b56ea5" containerID="f661bb563ead0bf260e8494ec73c509e32bae2f6ca65512ec8607f11dc2c0d22" exitCode=0 Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.801558 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4w5f" event={"ID":"65035c1b-9a6b-466e-91cc-a603c7b56ea5","Type":"ContainerDied","Data":"f661bb563ead0bf260e8494ec73c509e32bae2f6ca65512ec8607f11dc2c0d22"} Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.804006 4931 generic.go:334] "Generic (PLEG): container finished" podID="174578a2-050b-44a8-9f8c-75d918589c78" containerID="6418cf119c9faa1901b2d0d27709caca64b33694788e403f87f151282873395c" exitCode=0 Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.804031 4931 generic.go:334] "Generic (PLEG): container finished" podID="174578a2-050b-44a8-9f8c-75d918589c78" containerID="15f9cb36f6d653ce9e721db1bcf38fc6a4039d817385c223e9d7107ae75e02f9" exitCode=143 Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.804077 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"174578a2-050b-44a8-9f8c-75d918589c78","Type":"ContainerDied","Data":"6418cf119c9faa1901b2d0d27709caca64b33694788e403f87f151282873395c"} Dec 01 15:19:56 crc kubenswrapper[4931]: I1201 15:19:56.804174 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"174578a2-050b-44a8-9f8c-75d918589c78","Type":"ContainerDied","Data":"15f9cb36f6d653ce9e721db1bcf38fc6a4039d817385c223e9d7107ae75e02f9"} Dec 01 15:19:57 crc kubenswrapper[4931]: I1201 15:19:57.640064 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:19:57 crc kubenswrapper[4931]: I1201 
15:19:57.705150 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zvwkt"] Dec 01 15:19:57 crc kubenswrapper[4931]: I1201 15:19:57.705769 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" containerID="cri-o://8e8e3d00dd4510d8c026119eb82ad7d32245c7b05317be8aad83da5e86e754ef" gracePeriod=10 Dec 01 15:19:58 crc kubenswrapper[4931]: I1201 15:19:58.932980 4931 generic.go:334] "Generic (PLEG): container finished" podID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerID="8e8e3d00dd4510d8c026119eb82ad7d32245c7b05317be8aad83da5e86e754ef" exitCode=0 Dec 01 15:19:58 crc kubenswrapper[4931]: I1201 15:19:58.933030 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" event={"ID":"f61f6fb9-c1c7-4feb-a851-711b9f093142","Type":"ContainerDied","Data":"8e8e3d00dd4510d8c026119eb82ad7d32245c7b05317be8aad83da5e86e754ef"} Dec 01 15:20:01 crc kubenswrapper[4931]: I1201 15:20:01.746226 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Dec 01 15:20:06 crc kubenswrapper[4931]: I1201 15:20:06.745672 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Dec 01 15:20:11 crc kubenswrapper[4931]: I1201 15:20:11.054098 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"32d885cd688b6c0582dfe7d283cc3525fe35214c88add3f923f138e1e2c42b82"} Dec 01 
15:20:11 crc kubenswrapper[4931]: E1201 15:20:11.298373 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 01 15:20:11 crc kubenswrapper[4931]: E1201 15:20:11.298555 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n98h56ch77h569h544h7fh656h64h87h56bh9bh57dh75hbdh57ch576h697h579h68ch548h5b9h5bdh58ch57dh57hbchf5h595h94h586h564h5bbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vcb26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*424
00,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-588bb94b89-fmwv2_openstack(f96c96c0-f0e9-45f6-82db-984fb2aec67f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:20:11 crc kubenswrapper[4931]: E1201 15:20:11.301638 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-588bb94b89-fmwv2" podUID="f96c96c0-f0e9-45f6-82db-984fb2aec67f" Dec 01 15:20:11 crc kubenswrapper[4931]: I1201 15:20:11.745245 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Dec 01 15:20:11 crc kubenswrapper[4931]: I1201 15:20:11.745685 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:20:12 crc kubenswrapper[4931]: E1201 15:20:12.725034 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 01 15:20:12 crc kubenswrapper[4931]: E1201 15:20:12.725266 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n579hc7h5b8h9bh67fh54fh585h557h599h5d7hb4h59fh675h59fhd4h55fh654h57dh66h56ch5d7h5d7h67fh585h549h698h685hd6h5cdh65ch59h9bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wtrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6e1dea62-ecc2-401d-b436-acc91fba2d5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:20:12 crc kubenswrapper[4931]: E1201 15:20:12.739965 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 01 15:20:12 crc kubenswrapper[4931]: E1201 15:20:12.740176 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n647h64chcch5bfh647h5d8hd5h697h576hd8h554h64dh695h546h654h67bh555h54h698h55h69hc5h5f7h54chb8h685h59bhd4hbbh5bbh5f9h64fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4bsfz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-bd7477fc7-x2lt5_openstack(1330f183-8546-45e4-8ad8-9ec6eb7affa6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:20:12 crc kubenswrapper[4931]: E1201 
15:20:12.744709 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-bd7477fc7-x2lt5" podUID="1330f183-8546-45e4-8ad8-9ec6eb7affa6" Dec 01 15:20:12 crc kubenswrapper[4931]: E1201 15:20:12.773702 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 01 15:20:12 crc kubenswrapper[4931]: E1201 15:20:12.773956 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f7h557h54dh546h674h67bhb6h588h9ch57fhfbh58dh54h687hdh68h99h596hcdh8h588h8bh5d5h99h67bh5f5h8ch5fh589h597hf4h57fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jd2rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-68669fb557-g4zdb_openstack(19efc2e2-7d7e-455c-b966-525f140a2dad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:20:12 crc kubenswrapper[4931]: E1201 
15:20:12.776974 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-68669fb557-g4zdb" podUID="19efc2e2-7d7e-455c-b966-525f140a2dad" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.808263 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k4w5f" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.822452 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-588bb94b89-fmwv2" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916010 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-credential-keys\") pod \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916443 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f96c96c0-f0e9-45f6-82db-984fb2aec67f-logs\") pod \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916467 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-scripts\") pod \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916486 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-config-data\") pod \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916571 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-fernet-keys\") pod \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916596 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-scripts\") pod \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916631 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-config-data\") pod \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916727 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfkqj\" (UniqueName: \"kubernetes.io/projected/65035c1b-9a6b-466e-91cc-a603c7b56ea5-kube-api-access-nfkqj\") pod \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916792 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f96c96c0-f0e9-45f6-82db-984fb2aec67f-logs" (OuterVolumeSpecName: "logs") pod "f96c96c0-f0e9-45f6-82db-984fb2aec67f" (UID: "f96c96c0-f0e9-45f6-82db-984fb2aec67f"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916846 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcb26\" (UniqueName: \"kubernetes.io/projected/f96c96c0-f0e9-45f6-82db-984fb2aec67f-kube-api-access-vcb26\") pod \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916884 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f96c96c0-f0e9-45f6-82db-984fb2aec67f-horizon-secret-key\") pod \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\" (UID: \"f96c96c0-f0e9-45f6-82db-984fb2aec67f\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.916918 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-combined-ca-bundle\") pod \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\" (UID: \"65035c1b-9a6b-466e-91cc-a603c7b56ea5\") " Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.917302 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-config-data" (OuterVolumeSpecName: "config-data") pod "f96c96c0-f0e9-45f6-82db-984fb2aec67f" (UID: "f96c96c0-f0e9-45f6-82db-984fb2aec67f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.917326 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f96c96c0-f0e9-45f6-82db-984fb2aec67f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.918024 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-scripts" (OuterVolumeSpecName: "scripts") pod "f96c96c0-f0e9-45f6-82db-984fb2aec67f" (UID: "f96c96c0-f0e9-45f6-82db-984fb2aec67f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.924262 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f96c96c0-f0e9-45f6-82db-984fb2aec67f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f96c96c0-f0e9-45f6-82db-984fb2aec67f" (UID: "f96c96c0-f0e9-45f6-82db-984fb2aec67f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.924625 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "65035c1b-9a6b-466e-91cc-a603c7b56ea5" (UID: "65035c1b-9a6b-466e-91cc-a603c7b56ea5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.924905 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-scripts" (OuterVolumeSpecName: "scripts") pod "65035c1b-9a6b-466e-91cc-a603c7b56ea5" (UID: "65035c1b-9a6b-466e-91cc-a603c7b56ea5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.924875 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f96c96c0-f0e9-45f6-82db-984fb2aec67f-kube-api-access-vcb26" (OuterVolumeSpecName: "kube-api-access-vcb26") pod "f96c96c0-f0e9-45f6-82db-984fb2aec67f" (UID: "f96c96c0-f0e9-45f6-82db-984fb2aec67f"). InnerVolumeSpecName "kube-api-access-vcb26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.935605 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "65035c1b-9a6b-466e-91cc-a603c7b56ea5" (UID: "65035c1b-9a6b-466e-91cc-a603c7b56ea5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.942842 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65035c1b-9a6b-466e-91cc-a603c7b56ea5-kube-api-access-nfkqj" (OuterVolumeSpecName: "kube-api-access-nfkqj") pod "65035c1b-9a6b-466e-91cc-a603c7b56ea5" (UID: "65035c1b-9a6b-466e-91cc-a603c7b56ea5"). InnerVolumeSpecName "kube-api-access-nfkqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.952414 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-config-data" (OuterVolumeSpecName: "config-data") pod "65035c1b-9a6b-466e-91cc-a603c7b56ea5" (UID: "65035c1b-9a6b-466e-91cc-a603c7b56ea5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:14 crc kubenswrapper[4931]: I1201 15:20:14.970547 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65035c1b-9a6b-466e-91cc-a603c7b56ea5" (UID: "65035c1b-9a6b-466e-91cc-a603c7b56ea5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018402 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018444 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfkqj\" (UniqueName: \"kubernetes.io/projected/65035c1b-9a6b-466e-91cc-a603c7b56ea5-kube-api-access-nfkqj\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018456 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcb26\" (UniqueName: \"kubernetes.io/projected/f96c96c0-f0e9-45f6-82db-984fb2aec67f-kube-api-access-vcb26\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018465 4931 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f96c96c0-f0e9-45f6-82db-984fb2aec67f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018477 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018485 4931 reconciler_common.go:293] "Volume detached for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018494 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018502 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f96c96c0-f0e9-45f6-82db-984fb2aec67f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018509 4931 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.018517 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65035c1b-9a6b-466e-91cc-a603c7b56ea5-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.100912 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-588bb94b89-fmwv2" event={"ID":"f96c96c0-f0e9-45f6-82db-984fb2aec67f","Type":"ContainerDied","Data":"5f248700e3cb416cb7f09ca06722dd20bdb029055effdd142cbfe51c1fd2672d"} Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.100994 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-588bb94b89-fmwv2" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.109246 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k4w5f" event={"ID":"65035c1b-9a6b-466e-91cc-a603c7b56ea5","Type":"ContainerDied","Data":"e3f9931c6fbfd123fbbf3c5e2b8d609322fa15a9ec1d48218d0cc3ebed79ec3c"} Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.109278 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f9931c6fbfd123fbbf3c5e2b8d609322fa15a9ec1d48218d0cc3ebed79ec3c" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.109338 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k4w5f" Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.167338 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-588bb94b89-fmwv2"] Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.179204 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-588bb94b89-fmwv2"] Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.897104 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k4w5f"] Dec 01 15:20:15 crc kubenswrapper[4931]: I1201 15:20:15.922416 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k4w5f"] Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.006861 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x5lvn"] Dec 01 15:20:16 crc kubenswrapper[4931]: E1201 15:20:16.007280 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65035c1b-9a6b-466e-91cc-a603c7b56ea5" containerName="keystone-bootstrap" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.007297 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="65035c1b-9a6b-466e-91cc-a603c7b56ea5" containerName="keystone-bootstrap" Dec 01 15:20:16 crc 
kubenswrapper[4931]: I1201 15:20:16.007526 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="65035c1b-9a6b-466e-91cc-a603c7b56ea5" containerName="keystone-bootstrap" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.008121 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.011280 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.011741 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.013344 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x5lvn"] Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.013612 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.013751 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w5fnq" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.013870 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.140127 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj87b\" (UniqueName: \"kubernetes.io/projected/04a78a28-a980-4393-bd41-e0f523fccf7e-kube-api-access-sj87b\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.140188 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-credential-keys\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.140221 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-scripts\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.140312 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-fernet-keys\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.140458 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-config-data\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.140485 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-combined-ca-bundle\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.242100 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-fernet-keys\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.242218 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-config-data\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.242246 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-combined-ca-bundle\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.243113 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj87b\" (UniqueName: \"kubernetes.io/projected/04a78a28-a980-4393-bd41-e0f523fccf7e-kube-api-access-sj87b\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.243458 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-credential-keys\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.243486 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-scripts\") pod 
\"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.249805 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-scripts\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.250066 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-credential-keys\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.250201 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-config-data\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.252927 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-fernet-keys\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.253470 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-combined-ca-bundle\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: 
I1201 15:20:16.254012 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65035c1b-9a6b-466e-91cc-a603c7b56ea5" path="/var/lib/kubelet/pods/65035c1b-9a6b-466e-91cc-a603c7b56ea5/volumes" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.254733 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f96c96c0-f0e9-45f6-82db-984fb2aec67f" path="/var/lib/kubelet/pods/f96c96c0-f0e9-45f6-82db-984fb2aec67f/volumes" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.260684 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj87b\" (UniqueName: \"kubernetes.io/projected/04a78a28-a980-4393-bd41-e0f523fccf7e-kube-api-access-sj87b\") pod \"keystone-bootstrap-x5lvn\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:16 crc kubenswrapper[4931]: I1201 15:20:16.333183 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:17 crc kubenswrapper[4931]: I1201 15:20:17.891019 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:17 crc kubenswrapper[4931]: I1201 15:20:17.891069 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:21 crc kubenswrapper[4931]: I1201 15:20:21.746077 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Dec 01 15:20:26 crc kubenswrapper[4931]: I1201 15:20:26.747407 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" 
Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.218155 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68669fb557-g4zdb" event={"ID":"19efc2e2-7d7e-455c-b966-525f140a2dad","Type":"ContainerDied","Data":"2713b0106d037ae51c206465fcd0d1d1e3695bb69f6381834c623445c09beb26"} Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.218521 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2713b0106d037ae51c206465fcd0d1d1e3695bb69f6381834c623445c09beb26" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.220397 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd7477fc7-x2lt5" event={"ID":"1330f183-8546-45e4-8ad8-9ec6eb7affa6","Type":"ContainerDied","Data":"3a18193436b7a9ec8dbc807cdf0c9210631a7a660901cebbb0a7d376e10e7798"} Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.220424 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a18193436b7a9ec8dbc807cdf0c9210631a7a660901cebbb0a7d376e10e7798" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.222338 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"174578a2-050b-44a8-9f8c-75d918589c78","Type":"ContainerDied","Data":"8054530f5ae47953e3f8e18a2f1b7634712e91e2d035d7b89f68bf5ea9a4cbbf"} Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.222365 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8054530f5ae47953e3f8e18a2f1b7634712e91e2d035d7b89f68bf5ea9a4cbbf" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.264694 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.270161 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.276732 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd7477fc7-x2lt5" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400268 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-scripts\") pod \"174578a2-050b-44a8-9f8c-75d918589c78\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400337 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-logs\") pod \"174578a2-050b-44a8-9f8c-75d918589c78\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400370 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-internal-tls-certs\") pod \"174578a2-050b-44a8-9f8c-75d918589c78\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400409 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"174578a2-050b-44a8-9f8c-75d918589c78\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400459 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l6q6\" (UniqueName: \"kubernetes.io/projected/174578a2-050b-44a8-9f8c-75d918589c78-kube-api-access-4l6q6\") pod \"174578a2-050b-44a8-9f8c-75d918589c78\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " Dec 01 15:20:29 crc 
kubenswrapper[4931]: I1201 15:20:29.400474 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-httpd-run\") pod \"174578a2-050b-44a8-9f8c-75d918589c78\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400503 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-config-data\") pod \"174578a2-050b-44a8-9f8c-75d918589c78\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400533 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bsfz\" (UniqueName: \"kubernetes.io/projected/1330f183-8546-45e4-8ad8-9ec6eb7affa6-kube-api-access-4bsfz\") pod \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400566 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1330f183-8546-45e4-8ad8-9ec6eb7affa6-logs\") pod \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400590 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-config-data\") pod \"19efc2e2-7d7e-455c-b966-525f140a2dad\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400652 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-scripts\") pod 
\"19efc2e2-7d7e-455c-b966-525f140a2dad\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400676 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-scripts\") pod \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400697 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd2rd\" (UniqueName: \"kubernetes.io/projected/19efc2e2-7d7e-455c-b966-525f140a2dad-kube-api-access-jd2rd\") pod \"19efc2e2-7d7e-455c-b966-525f140a2dad\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400714 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-combined-ca-bundle\") pod \"174578a2-050b-44a8-9f8c-75d918589c78\" (UID: \"174578a2-050b-44a8-9f8c-75d918589c78\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400755 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-config-data\") pod \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400780 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19efc2e2-7d7e-455c-b966-525f140a2dad-horizon-secret-key\") pod \"19efc2e2-7d7e-455c-b966-525f140a2dad\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400809 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19efc2e2-7d7e-455c-b966-525f140a2dad-logs\") pod \"19efc2e2-7d7e-455c-b966-525f140a2dad\" (UID: \"19efc2e2-7d7e-455c-b966-525f140a2dad\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400824 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1330f183-8546-45e4-8ad8-9ec6eb7affa6-horizon-secret-key\") pod \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\" (UID: \"1330f183-8546-45e4-8ad8-9ec6eb7affa6\") " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.400919 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-logs" (OuterVolumeSpecName: "logs") pod "174578a2-050b-44a8-9f8c-75d918589c78" (UID: "174578a2-050b-44a8-9f8c-75d918589c78"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.401188 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.401325 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "174578a2-050b-44a8-9f8c-75d918589c78" (UID: "174578a2-050b-44a8-9f8c-75d918589c78"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.401343 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-scripts" (OuterVolumeSpecName: "scripts") pod "19efc2e2-7d7e-455c-b966-525f140a2dad" (UID: "19efc2e2-7d7e-455c-b966-525f140a2dad"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.401581 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1330f183-8546-45e4-8ad8-9ec6eb7affa6-logs" (OuterVolumeSpecName: "logs") pod "1330f183-8546-45e4-8ad8-9ec6eb7affa6" (UID: "1330f183-8546-45e4-8ad8-9ec6eb7affa6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.401781 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19efc2e2-7d7e-455c-b966-525f140a2dad-logs" (OuterVolumeSpecName: "logs") pod "19efc2e2-7d7e-455c-b966-525f140a2dad" (UID: "19efc2e2-7d7e-455c-b966-525f140a2dad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.401799 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-scripts" (OuterVolumeSpecName: "scripts") pod "1330f183-8546-45e4-8ad8-9ec6eb7affa6" (UID: "1330f183-8546-45e4-8ad8-9ec6eb7affa6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.402020 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-config-data" (OuterVolumeSpecName: "config-data") pod "19efc2e2-7d7e-455c-b966-525f140a2dad" (UID: "19efc2e2-7d7e-455c-b966-525f140a2dad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.402073 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-config-data" (OuterVolumeSpecName: "config-data") pod "1330f183-8546-45e4-8ad8-9ec6eb7affa6" (UID: "1330f183-8546-45e4-8ad8-9ec6eb7affa6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.407224 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-scripts" (OuterVolumeSpecName: "scripts") pod "174578a2-050b-44a8-9f8c-75d918589c78" (UID: "174578a2-050b-44a8-9f8c-75d918589c78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.407659 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19efc2e2-7d7e-455c-b966-525f140a2dad-kube-api-access-jd2rd" (OuterVolumeSpecName: "kube-api-access-jd2rd") pod "19efc2e2-7d7e-455c-b966-525f140a2dad" (UID: "19efc2e2-7d7e-455c-b966-525f140a2dad"). InnerVolumeSpecName "kube-api-access-jd2rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.407648 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "174578a2-050b-44a8-9f8c-75d918589c78" (UID: "174578a2-050b-44a8-9f8c-75d918589c78"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.408176 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1330f183-8546-45e4-8ad8-9ec6eb7affa6-kube-api-access-4bsfz" (OuterVolumeSpecName: "kube-api-access-4bsfz") pod "1330f183-8546-45e4-8ad8-9ec6eb7affa6" (UID: "1330f183-8546-45e4-8ad8-9ec6eb7affa6"). InnerVolumeSpecName "kube-api-access-4bsfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.408583 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19efc2e2-7d7e-455c-b966-525f140a2dad-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "19efc2e2-7d7e-455c-b966-525f140a2dad" (UID: "19efc2e2-7d7e-455c-b966-525f140a2dad"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.409127 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1330f183-8546-45e4-8ad8-9ec6eb7affa6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1330f183-8546-45e4-8ad8-9ec6eb7affa6" (UID: "1330f183-8546-45e4-8ad8-9ec6eb7affa6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.411019 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174578a2-050b-44a8-9f8c-75d918589c78-kube-api-access-4l6q6" (OuterVolumeSpecName: "kube-api-access-4l6q6") pod "174578a2-050b-44a8-9f8c-75d918589c78" (UID: "174578a2-050b-44a8-9f8c-75d918589c78"). InnerVolumeSpecName "kube-api-access-4l6q6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.445596 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "174578a2-050b-44a8-9f8c-75d918589c78" (UID: "174578a2-050b-44a8-9f8c-75d918589c78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.453572 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "174578a2-050b-44a8-9f8c-75d918589c78" (UID: "174578a2-050b-44a8-9f8c-75d918589c78"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.460069 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-config-data" (OuterVolumeSpecName: "config-data") pod "174578a2-050b-44a8-9f8c-75d918589c78" (UID: "174578a2-050b-44a8-9f8c-75d918589c78"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503320 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503360 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd2rd\" (UniqueName: \"kubernetes.io/projected/19efc2e2-7d7e-455c-b966-525f140a2dad-kube-api-access-jd2rd\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503377 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503412 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1330f183-8546-45e4-8ad8-9ec6eb7affa6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503425 4931 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19efc2e2-7d7e-455c-b966-525f140a2dad-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503436 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19efc2e2-7d7e-455c-b966-525f140a2dad-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503447 4931 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1330f183-8546-45e4-8ad8-9ec6eb7affa6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503457 4931 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503468 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503506 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503518 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l6q6\" (UniqueName: \"kubernetes.io/projected/174578a2-050b-44a8-9f8c-75d918589c78-kube-api-access-4l6q6\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503531 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/174578a2-050b-44a8-9f8c-75d918589c78-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503543 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/174578a2-050b-44a8-9f8c-75d918589c78-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503554 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bsfz\" (UniqueName: \"kubernetes.io/projected/1330f183-8546-45e4-8ad8-9ec6eb7affa6-kube-api-access-4bsfz\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503565 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1330f183-8546-45e4-8ad8-9ec6eb7affa6-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503576 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.503586 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19efc2e2-7d7e-455c-b966-525f140a2dad-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.523425 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 01 15:20:29 crc kubenswrapper[4931]: I1201 15:20:29.604987 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.124256 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.231083 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" event={"ID":"f61f6fb9-c1c7-4feb-a851-711b9f093142","Type":"ContainerDied","Data":"2a997ffe72072ef8c22243d727273db31d3280d49d6e1772718974d71002c1ce"} Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.231110 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-bd7477fc7-x2lt5" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.231151 4931 scope.go:117] "RemoveContainer" containerID="8e8e3d00dd4510d8c026119eb82ad7d32245c7b05317be8aad83da5e86e754ef" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.231260 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.231538 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68669fb557-g4zdb" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.232603 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.311635 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bd7477fc7-x2lt5"] Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.318380 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bd7477fc7-x2lt5"] Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.324876 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-sb\") pod \"f61f6fb9-c1c7-4feb-a851-711b9f093142\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.324914 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wt5z\" (UniqueName: \"kubernetes.io/projected/f61f6fb9-c1c7-4feb-a851-711b9f093142-kube-api-access-7wt5z\") pod \"f61f6fb9-c1c7-4feb-a851-711b9f093142\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.325035 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-nb\") pod \"f61f6fb9-c1c7-4feb-a851-711b9f093142\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.325080 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-config\") pod \"f61f6fb9-c1c7-4feb-a851-711b9f093142\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.325819 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-dns-svc\") pod \"f61f6fb9-c1c7-4feb-a851-711b9f093142\" (UID: \"f61f6fb9-c1c7-4feb-a851-711b9f093142\") " Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.327255 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.332643 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f61f6fb9-c1c7-4feb-a851-711b9f093142-kube-api-access-7wt5z" (OuterVolumeSpecName: "kube-api-access-7wt5z") pod "f61f6fb9-c1c7-4feb-a851-711b9f093142" (UID: "f61f6fb9-c1c7-4feb-a851-711b9f093142"). InnerVolumeSpecName "kube-api-access-7wt5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.345689 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.359916 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:20:30 crc kubenswrapper[4931]: E1201 15:20:30.360267 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="init" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.360281 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="init" Dec 01 15:20:30 crc kubenswrapper[4931]: E1201 15:20:30.360295 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.360301 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" Dec 01 15:20:30 crc kubenswrapper[4931]: E1201 15:20:30.360332 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174578a2-050b-44a8-9f8c-75d918589c78" containerName="glance-httpd" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.360337 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="174578a2-050b-44a8-9f8c-75d918589c78" containerName="glance-httpd" Dec 01 15:20:30 crc kubenswrapper[4931]: E1201 15:20:30.360350 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174578a2-050b-44a8-9f8c-75d918589c78" containerName="glance-log" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.360355 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="174578a2-050b-44a8-9f8c-75d918589c78" containerName="glance-log" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.360535 4931 
memory_manager.go:354] "RemoveStaleState removing state" podUID="174578a2-050b-44a8-9f8c-75d918589c78" containerName="glance-httpd" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.360544 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="174578a2-050b-44a8-9f8c-75d918589c78" containerName="glance-log" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.360552 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.361349 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.367940 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.368147 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.402515 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.413022 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f61f6fb9-c1c7-4feb-a851-711b9f093142" (UID: "f61f6fb9-c1c7-4feb-a851-711b9f093142"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.420075 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f61f6fb9-c1c7-4feb-a851-711b9f093142" (UID: "f61f6fb9-c1c7-4feb-a851-711b9f093142"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.420208 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f61f6fb9-c1c7-4feb-a851-711b9f093142" (UID: "f61f6fb9-c1c7-4feb-a851-711b9f093142"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.423379 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68669fb557-g4zdb"] Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.427750 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.427781 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wt5z\" (UniqueName: \"kubernetes.io/projected/f61f6fb9-c1c7-4feb-a851-711b9f093142-kube-api-access-7wt5z\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.427792 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.427801 4931 reconciler_common.go:293] "Volume 
detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.433784 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68669fb557-g4zdb"] Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.441070 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-config" (OuterVolumeSpecName: "config") pod "f61f6fb9-c1c7-4feb-a851-711b9f093142" (UID: "f61f6fb9-c1c7-4feb-a851-711b9f093142"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.529869 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.529942 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.529974 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.530034 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.530170 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rwl\" (UniqueName: \"kubernetes.io/projected/3a5867d7-6574-4f95-97c9-f6830600606a-kube-api-access-94rwl\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.530322 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-logs\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.530419 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.530440 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc 
kubenswrapper[4931]: I1201 15:20:30.530743 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f61f6fb9-c1c7-4feb-a851-711b9f093142-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.564754 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zvwkt"] Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.572404 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zvwkt"] Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.631717 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.631775 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.631803 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.631842 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.631861 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rwl\" (UniqueName: \"kubernetes.io/projected/3a5867d7-6574-4f95-97c9-f6830600606a-kube-api-access-94rwl\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.631902 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-logs\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.631935 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.631953 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.632047 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") device mount path 
\"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.633667 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-logs\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.634588 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.639946 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.642281 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.643729 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.644229 
4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.648586 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rwl\" (UniqueName: \"kubernetes.io/projected/3a5867d7-6574-4f95-97c9-f6830600606a-kube-api-access-94rwl\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.662971 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:20:30 crc kubenswrapper[4931]: I1201 15:20:30.799765 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:31 crc kubenswrapper[4931]: E1201 15:20:31.206699 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 01 15:20:31 crc kubenswrapper[4931]: E1201 15:20:31.206967 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rrdxb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Volu
meDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-qsr4v_openstack(a79a9139-dcea-4b3f-83dc-a1715f087ac5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:20:31 crc kubenswrapper[4931]: E1201 15:20:31.208273 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-qsr4v" podUID="a79a9139-dcea-4b3f-83dc-a1715f087ac5" Dec 01 15:20:31 crc kubenswrapper[4931]: E1201 15:20:31.250246 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-qsr4v" podUID="a79a9139-dcea-4b3f-83dc-a1715f087ac5" Dec 01 15:20:31 crc kubenswrapper[4931]: I1201 15:20:31.748932 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zvwkt" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Dec 01 15:20:32 crc kubenswrapper[4931]: I1201 15:20:32.254812 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1330f183-8546-45e4-8ad8-9ec6eb7affa6" path="/var/lib/kubelet/pods/1330f183-8546-45e4-8ad8-9ec6eb7affa6/volumes" Dec 01 15:20:32 crc kubenswrapper[4931]: I1201 15:20:32.255422 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="174578a2-050b-44a8-9f8c-75d918589c78" path="/var/lib/kubelet/pods/174578a2-050b-44a8-9f8c-75d918589c78/volumes" Dec 01 15:20:32 crc kubenswrapper[4931]: I1201 15:20:32.256279 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="19efc2e2-7d7e-455c-b966-525f140a2dad" path="/var/lib/kubelet/pods/19efc2e2-7d7e-455c-b966-525f140a2dad/volumes" Dec 01 15:20:32 crc kubenswrapper[4931]: I1201 15:20:32.256686 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f61f6fb9-c1c7-4feb-a851-711b9f093142" path="/var/lib/kubelet/pods/f61f6fb9-c1c7-4feb-a851-711b9f093142/volumes" Dec 01 15:20:33 crc kubenswrapper[4931]: E1201 15:20:33.297202 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 01 15:20:33 crc kubenswrapper[4931]: E1201 15:20:33.297453 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:n
il,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h69rh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-cmc5n_openstack(658da6f1-ac70-4d83-83ca-f79e69f0979d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:20:33 crc kubenswrapper[4931]: E1201 15:20:33.298663 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-cmc5n" podUID="658da6f1-ac70-4d83-83ca-f79e69f0979d" Dec 01 15:20:33 crc kubenswrapper[4931]: I1201 15:20:33.736617 4931 scope.go:117] "RemoveContainer" containerID="58d78ec23db06d08e250939e3d9d32c542a0768487415f900d795c9ac71e1813" Dec 01 15:20:34 crc kubenswrapper[4931]: I1201 15:20:34.219776 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-x5lvn"] Dec 01 15:20:34 crc kubenswrapper[4931]: I1201 15:20:34.228516 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6479b7c68-txrvx"] Dec 01 15:20:34 crc kubenswrapper[4931]: I1201 15:20:34.273519 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6479b7c68-txrvx" event={"ID":"97ed61f3-8ca0-4aee-afae-168398babe70","Type":"ContainerStarted","Data":"5131abc1b85f2d20da4a7776bc6d7fd12f68918cf3a16618e728bc5936ca81fe"} Dec 01 15:20:34 crc kubenswrapper[4931]: I1201 15:20:34.274580 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x5lvn" event={"ID":"04a78a28-a980-4393-bd41-e0f523fccf7e","Type":"ContainerStarted","Data":"f2cbe212a44cf02e6f5fa1977ddf0fa7ae2015e5f122248e9924f7c8eb140bac"} Dec 01 15:20:34 crc kubenswrapper[4931]: E1201 15:20:34.275776 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-cmc5n" podUID="658da6f1-ac70-4d83-83ca-f79e69f0979d" Dec 01 15:20:34 crc kubenswrapper[4931]: I1201 15:20:34.359561 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65c944c654-l6mmj"] Dec 01 15:20:34 crc kubenswrapper[4931]: I1201 15:20:34.540642 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:20:34 crc kubenswrapper[4931]: W1201 15:20:34.541662 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a5867d7_6574_4f95_97c9_f6830600606a.slice/crio-4e4b271046ada8760b61be95ca12599de001ebc652731a37b23a07403e6422f1 WatchSource:0}: Error finding container 4e4b271046ada8760b61be95ca12599de001ebc652731a37b23a07403e6422f1: Status 404 returned error can't find the 
container with id 4e4b271046ada8760b61be95ca12599de001ebc652731a37b23a07403e6422f1 Dec 01 15:20:35 crc kubenswrapper[4931]: E1201 15:20:35.235518 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified" Dec 01 15:20:35 crc kubenswrapper[4931]: E1201 15:20:35.236263 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n579hc7h5b8h9bh67fh54fh585h557h599h5d7hb4h59fh675h59fhd4h55fh654h57dh66h56ch5d7h5d7h67fh585h549h698h685hd6h5cdh65ch59h9bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wtrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeH
andler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6e1dea62-ecc2-401d-b436-acc91fba2d5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:20:35 crc kubenswrapper[4931]: I1201 15:20:35.294370 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65c944c654-l6mmj" event={"ID":"1a2f9f3b-603b-4004-8e6f-dce5b810785c","Type":"ContainerStarted","Data":"fee0764e136f84fb3211676add1ce278f5e30afc5ca66f9c4354e5f35eaca58e"} Dec 01 15:20:35 crc kubenswrapper[4931]: I1201 15:20:35.297050 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x5lvn" event={"ID":"04a78a28-a980-4393-bd41-e0f523fccf7e","Type":"ContainerStarted","Data":"bff0594af606cbd38e8a481f84acc561c9cabc0f5f3c71429ae70be32efce92d"} Dec 01 15:20:35 crc kubenswrapper[4931]: I1201 15:20:35.298542 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c07769f8-f78c-4248-b590-f916b5362cd3","Type":"ContainerStarted","Data":"f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed"} Dec 01 
15:20:35 crc kubenswrapper[4931]: I1201 15:20:35.299822 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ffphc" event={"ID":"38f515fe-6925-463c-b5dc-87b23d360ec5","Type":"ContainerStarted","Data":"19cc472e6fe2a229b8b7de1d3e4ad6fbc2d6a34025debd2ed8dd363218082cf4"} Dec 01 15:20:35 crc kubenswrapper[4931]: I1201 15:20:35.328087 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ffphc" podStartSLOduration=9.562545977 podStartE2EDuration="48.328065542s" podCreationTimestamp="2025-12-01 15:19:47 +0000 UTC" firstStartedPulling="2025-12-01 15:19:51.30948044 +0000 UTC m=+1137.735354117" lastFinishedPulling="2025-12-01 15:20:30.075000015 +0000 UTC m=+1176.500873682" observedRunningTime="2025-12-01 15:20:35.323832154 +0000 UTC m=+1181.749705831" watchObservedRunningTime="2025-12-01 15:20:35.328065542 +0000 UTC m=+1181.753939209" Dec 01 15:20:35 crc kubenswrapper[4931]: I1201 15:20:35.329532 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"4e4ed6dba6fb114a6fa001d5a7e833110cfac31acfad0f0aa34153474c4cd322"} Dec 01 15:20:35 crc kubenswrapper[4931]: I1201 15:20:35.333134 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a5867d7-6574-4f95-97c9-f6830600606a","Type":"ContainerStarted","Data":"689bccfbbada21a31d285e8774317a671d2202169bb277c5a0debcf496e821f7"} Dec 01 15:20:35 crc kubenswrapper[4931]: I1201 15:20:35.333185 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a5867d7-6574-4f95-97c9-f6830600606a","Type":"ContainerStarted","Data":"4e4b271046ada8760b61be95ca12599de001ebc652731a37b23a07403e6422f1"} Dec 01 15:20:36 crc kubenswrapper[4931]: I1201 15:20:36.355959 4931 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="c07769f8-f78c-4248-b590-f916b5362cd3" containerName="glance-log" containerID="cri-o://88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93" gracePeriod=30 Dec 01 15:20:36 crc kubenswrapper[4931]: I1201 15:20:36.356531 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"df0e8af413a0eb952c0f5bb6026a894f861183b06bd656d6b8fee41a505a45cd"} Dec 01 15:20:36 crc kubenswrapper[4931]: I1201 15:20:36.356855 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c07769f8-f78c-4248-b590-f916b5362cd3" containerName="glance-httpd" containerID="cri-o://f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed" gracePeriod=30 Dec 01 15:20:36 crc kubenswrapper[4931]: I1201 15:20:36.383157 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=49.383132661 podStartE2EDuration="49.383132661s" podCreationTimestamp="2025-12-01 15:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:20:36.377181434 +0000 UTC m=+1182.803055101" watchObservedRunningTime="2025-12-01 15:20:36.383132661 +0000 UTC m=+1182.809006328" Dec 01 15:20:36 crc kubenswrapper[4931]: I1201 15:20:36.401727 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x5lvn" podStartSLOduration=21.401706123 podStartE2EDuration="21.401706123s" podCreationTimestamp="2025-12-01 15:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:20:36.399361677 +0000 UTC m=+1182.825235334" watchObservedRunningTime="2025-12-01 15:20:36.401706123 
+0000 UTC m=+1182.827579800" Dec 01 15:20:37 crc kubenswrapper[4931]: I1201 15:20:37.370853 4931 generic.go:334] "Generic (PLEG): container finished" podID="c07769f8-f78c-4248-b590-f916b5362cd3" containerID="88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93" exitCode=143 Dec 01 15:20:37 crc kubenswrapper[4931]: I1201 15:20:37.371080 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c07769f8-f78c-4248-b590-f916b5362cd3","Type":"ContainerDied","Data":"88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93"} Dec 01 15:20:37 crc kubenswrapper[4931]: I1201 15:20:37.378635 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"22dcdc777388e41a6eef0260c4ce9a92eb22ce889defb5e4c159378213bc189a"} Dec 01 15:20:37 crc kubenswrapper[4931]: I1201 15:20:37.380467 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65c944c654-l6mmj" event={"ID":"1a2f9f3b-603b-4004-8e6f-dce5b810785c","Type":"ContainerStarted","Data":"0f909b53c303e143e1485968e555d92ec30b8dce9c8f31e0091c52d839fb731f"} Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.129096 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.171013 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-scripts\") pod \"c07769f8-f78c-4248-b590-f916b5362cd3\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.171068 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"c07769f8-f78c-4248-b590-f916b5362cd3\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.171102 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-httpd-run\") pod \"c07769f8-f78c-4248-b590-f916b5362cd3\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.171167 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-combined-ca-bundle\") pod \"c07769f8-f78c-4248-b590-f916b5362cd3\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.171261 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-config-data\") pod \"c07769f8-f78c-4248-b590-f916b5362cd3\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.171282 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-public-tls-certs\") pod \"c07769f8-f78c-4248-b590-f916b5362cd3\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.171310 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-logs\") pod \"c07769f8-f78c-4248-b590-f916b5362cd3\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.171369 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nccg8\" (UniqueName: \"kubernetes.io/projected/c07769f8-f78c-4248-b590-f916b5362cd3-kube-api-access-nccg8\") pod \"c07769f8-f78c-4248-b590-f916b5362cd3\" (UID: \"c07769f8-f78c-4248-b590-f916b5362cd3\") " Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.175450 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-logs" (OuterVolumeSpecName: "logs") pod "c07769f8-f78c-4248-b590-f916b5362cd3" (UID: "c07769f8-f78c-4248-b590-f916b5362cd3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.177607 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c07769f8-f78c-4248-b590-f916b5362cd3" (UID: "c07769f8-f78c-4248-b590-f916b5362cd3"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.181006 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c07769f8-f78c-4248-b590-f916b5362cd3-kube-api-access-nccg8" (OuterVolumeSpecName: "kube-api-access-nccg8") pod "c07769f8-f78c-4248-b590-f916b5362cd3" (UID: "c07769f8-f78c-4248-b590-f916b5362cd3"). InnerVolumeSpecName "kube-api-access-nccg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.181946 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "c07769f8-f78c-4248-b590-f916b5362cd3" (UID: "c07769f8-f78c-4248-b590-f916b5362cd3"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.183109 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-scripts" (OuterVolumeSpecName: "scripts") pod "c07769f8-f78c-4248-b590-f916b5362cd3" (UID: "c07769f8-f78c-4248-b590-f916b5362cd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.200535 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c07769f8-f78c-4248-b590-f916b5362cd3" (UID: "c07769f8-f78c-4248-b590-f916b5362cd3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.222749 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c07769f8-f78c-4248-b590-f916b5362cd3" (UID: "c07769f8-f78c-4248-b590-f916b5362cd3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.234516 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-config-data" (OuterVolumeSpecName: "config-data") pod "c07769f8-f78c-4248-b590-f916b5362cd3" (UID: "c07769f8-f78c-4248-b590-f916b5362cd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.273535 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.273573 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.273583 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.273592 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:38 crc 
kubenswrapper[4931]: I1201 15:20:38.273601 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nccg8\" (UniqueName: \"kubernetes.io/projected/c07769f8-f78c-4248-b590-f916b5362cd3-kube-api-access-nccg8\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.273610 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c07769f8-f78c-4248-b590-f916b5362cd3-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.273631 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.273640 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c07769f8-f78c-4248-b590-f916b5362cd3-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.294154 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.374766 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.391945 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65c944c654-l6mmj" event={"ID":"1a2f9f3b-603b-4004-8e6f-dce5b810785c","Type":"ContainerStarted","Data":"4169ea44ff5a2ac786fe0f96af88c7850c1d78aa935370ebc69fdd888923b9d2"} Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.393738 4931 generic.go:334] "Generic (PLEG): container finished" podID="c07769f8-f78c-4248-b590-f916b5362cd3" 
containerID="f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed" exitCode=0 Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.393876 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.393875 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c07769f8-f78c-4248-b590-f916b5362cd3","Type":"ContainerDied","Data":"f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed"} Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.394079 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c07769f8-f78c-4248-b590-f916b5362cd3","Type":"ContainerDied","Data":"79fbb0f34c99bd20a5d2a68dc80e7a756df5ea8a0209a1ff0cc52e59331be07f"} Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.394099 4931 scope.go:117] "RemoveContainer" containerID="f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.405325 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"fe036b57-6753-42af-ad39-195f0688532d","Type":"ContainerStarted","Data":"26ac9174d5ca35b1ef7f6371f4549dc9d2d07f62ff2d5a2c3ca238b84f5c6191"} Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.407925 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6479b7c68-txrvx" event={"ID":"97ed61f3-8ca0-4aee-afae-168398babe70","Type":"ContainerStarted","Data":"63a67a4f00ff8d02071cd5c46b0692c9f484b2d61acc41dcf8b2b53a1f51fdb8"} Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.407962 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6479b7c68-txrvx" 
event={"ID":"97ed61f3-8ca0-4aee-afae-168398babe70","Type":"ContainerStarted","Data":"91144dfad5dcee4bce9e3dffbc29f3988a0e250ee32ee1cfa8b7b6b0c59d19c5"} Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.415177 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-65c944c654-l6mmj" podStartSLOduration=40.96685593 podStartE2EDuration="42.415164394s" podCreationTimestamp="2025-12-01 15:19:56 +0000 UTC" firstStartedPulling="2025-12-01 15:20:34.388797568 +0000 UTC m=+1180.814671235" lastFinishedPulling="2025-12-01 15:20:35.837106032 +0000 UTC m=+1182.262979699" observedRunningTime="2025-12-01 15:20:38.413449496 +0000 UTC m=+1184.839323213" watchObservedRunningTime="2025-12-01 15:20:38.415164394 +0000 UTC m=+1184.841038061" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.419890 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a5867d7-6574-4f95-97c9-f6830600606a","Type":"ContainerStarted","Data":"50178a8e177ef0ae4ce84e40919d15aee6c2f140b9641179cff854b7979db0d5"} Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.435853 4931 scope.go:117] "RemoveContainer" containerID="88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.444494 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.465505 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.486613 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:20:38 crc kubenswrapper[4931]: E1201 15:20:38.486923 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07769f8-f78c-4248-b590-f916b5362cd3" containerName="glance-log" Dec 01 15:20:38 crc 
kubenswrapper[4931]: I1201 15:20:38.486935 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07769f8-f78c-4248-b590-f916b5362cd3" containerName="glance-log" Dec 01 15:20:38 crc kubenswrapper[4931]: E1201 15:20:38.486952 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07769f8-f78c-4248-b590-f916b5362cd3" containerName="glance-httpd" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.486958 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07769f8-f78c-4248-b590-f916b5362cd3" containerName="glance-httpd" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.487123 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c07769f8-f78c-4248-b590-f916b5362cd3" containerName="glance-httpd" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.487148 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c07769f8-f78c-4248-b590-f916b5362cd3" containerName="glance-log" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.487938 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.490151 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6479b7c68-txrvx" podStartSLOduration=40.893053646 podStartE2EDuration="42.49013974s" podCreationTimestamp="2025-12-01 15:19:56 +0000 UTC" firstStartedPulling="2025-12-01 15:20:34.239900605 +0000 UTC m=+1180.665774272" lastFinishedPulling="2025-12-01 15:20:35.836986699 +0000 UTC m=+1182.262860366" observedRunningTime="2025-12-01 15:20:38.450315002 +0000 UTC m=+1184.876188669" watchObservedRunningTime="2025-12-01 15:20:38.49013974 +0000 UTC m=+1184.916013407" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.492766 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.493988 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.507538 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.513158 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=80.543228611 podStartE2EDuration="1m32.513140616s" podCreationTimestamp="2025-12-01 15:19:06 +0000 UTC" firstStartedPulling="2025-12-01 15:19:40.293894533 +0000 UTC m=+1126.719768200" lastFinishedPulling="2025-12-01 15:19:52.263806538 +0000 UTC m=+1138.689680205" observedRunningTime="2025-12-01 15:20:38.486345764 +0000 UTC m=+1184.912219431" watchObservedRunningTime="2025-12-01 15:20:38.513140616 +0000 UTC m=+1184.939014283" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.542120 4931 scope.go:117] "RemoveContainer" 
containerID="f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed" Dec 01 15:20:38 crc kubenswrapper[4931]: E1201 15:20:38.542638 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed\": container with ID starting with f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed not found: ID does not exist" containerID="f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.542668 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed"} err="failed to get container status \"f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed\": rpc error: code = NotFound desc = could not find container \"f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed\": container with ID starting with f1c1aeab3993384c79c7894c76d031687e6d2a0af240d9454d403e3a56a386ed not found: ID does not exist" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.542686 4931 scope.go:117] "RemoveContainer" containerID="88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93" Dec 01 15:20:38 crc kubenswrapper[4931]: E1201 15:20:38.542877 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93\": container with ID starting with 88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93 not found: ID does not exist" containerID="88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.542894 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93"} err="failed to get container status \"88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93\": rpc error: code = NotFound desc = could not find container \"88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93\": container with ID starting with 88c793e28a432110ff33d602ec42af1a1db711efcd7d7958a44ed402b356ce93 not found: ID does not exist" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.543721 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.543698565 podStartE2EDuration="8.543698565s" podCreationTimestamp="2025-12-01 15:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:20:38.517756836 +0000 UTC m=+1184.943630513" watchObservedRunningTime="2025-12-01 15:20:38.543698565 +0000 UTC m=+1184.969572232" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.579417 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.579455 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-logs\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.579511 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.579535 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.579559 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72bcf\" (UniqueName: \"kubernetes.io/projected/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-kube-api-access-72bcf\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.579593 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.579608 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.579635 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.680884 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.680926 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.680954 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72bcf\" (UniqueName: \"kubernetes.io/projected/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-kube-api-access-72bcf\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.680994 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-config-data\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.681011 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.681037 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.681084 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.681101 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-logs\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.681534 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-logs\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.682403 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.682929 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.686564 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.687039 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-scripts\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.687441 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.693811 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.702193 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72bcf\" (UniqueName: \"kubernetes.io/projected/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-kube-api-access-72bcf\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.717112 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.773439 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-2mz7d"] Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.774778 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.778034 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.782441 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.782538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.782604 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.782669 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.782701 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-config\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.782770 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9j6h\" (UniqueName: \"kubernetes.io/projected/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-kube-api-access-f9j6h\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.798949 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-2mz7d"] Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.883303 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.884518 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.884587 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-config\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.884662 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f9j6h\" (UniqueName: \"kubernetes.io/projected/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-kube-api-access-f9j6h\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.884785 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.884901 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.884930 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.885586 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-config\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.885715 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.886115 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.886789 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.888249 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:38 crc kubenswrapper[4931]: I1201 15:20:38.912665 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9j6h\" (UniqueName: \"kubernetes.io/projected/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-kube-api-access-f9j6h\") pod \"dnsmasq-dns-57c957c4ff-2mz7d\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:39 crc kubenswrapper[4931]: I1201 15:20:39.096531 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:39 crc kubenswrapper[4931]: I1201 15:20:39.461738 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:20:39 crc kubenswrapper[4931]: I1201 15:20:39.593858 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-2mz7d"] Dec 01 15:20:40 crc kubenswrapper[4931]: I1201 15:20:40.271232 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c07769f8-f78c-4248-b590-f916b5362cd3" path="/var/lib/kubelet/pods/c07769f8-f78c-4248-b590-f916b5362cd3/volumes" Dec 01 15:20:40 crc kubenswrapper[4931]: I1201 15:20:40.443516 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3d6c071d-fd2a-43e2-a7d2-c2499809aad0","Type":"ContainerStarted","Data":"f5ec4d93245a42904cae955178a61b7393926f8549f4199a5730546bc58a8e52"} Dec 01 15:20:40 crc kubenswrapper[4931]: I1201 15:20:40.445337 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" event={"ID":"8da5b32f-da6b-48c5-8cf3-3f1699e1774b","Type":"ContainerStarted","Data":"f365c2db35cacae74456068495322990007bcb8ff02a92f4f3a04c63857be40d"} Dec 01 15:20:40 crc kubenswrapper[4931]: I1201 15:20:40.447116 4931 generic.go:334] "Generic (PLEG): container finished" podID="04a78a28-a980-4393-bd41-e0f523fccf7e" containerID="bff0594af606cbd38e8a481f84acc561c9cabc0f5f3c71429ae70be32efce92d" exitCode=0 Dec 01 15:20:40 crc kubenswrapper[4931]: I1201 15:20:40.447159 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x5lvn" event={"ID":"04a78a28-a980-4393-bd41-e0f523fccf7e","Type":"ContainerDied","Data":"bff0594af606cbd38e8a481f84acc561c9cabc0f5f3c71429ae70be32efce92d"} Dec 01 15:20:40 crc kubenswrapper[4931]: I1201 15:20:40.800291 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Dec 01 15:20:40 crc kubenswrapper[4931]: I1201 15:20:40.800378 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:40 crc kubenswrapper[4931]: I1201 15:20:40.869553 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:40 crc kubenswrapper[4931]: I1201 15:20:40.872775 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:41 crc kubenswrapper[4931]: I1201 15:20:41.458053 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3d6c071d-fd2a-43e2-a7d2-c2499809aad0","Type":"ContainerStarted","Data":"2ca6ff941a63c9f5cb9c155cddf7cbc90e315a52a2ce845b9fbff5144f8e2f48"} Dec 01 15:20:41 crc kubenswrapper[4931]: I1201 15:20:41.460765 4931 generic.go:334] "Generic (PLEG): container finished" podID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" containerID="e265b69b915159314ed4ce14cded3338ecf91fe0215ac4f5ec0b072268421a3d" exitCode=0 Dec 01 15:20:41 crc kubenswrapper[4931]: I1201 15:20:41.460819 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" event={"ID":"8da5b32f-da6b-48c5-8cf3-3f1699e1774b","Type":"ContainerDied","Data":"e265b69b915159314ed4ce14cded3338ecf91fe0215ac4f5ec0b072268421a3d"} Dec 01 15:20:41 crc kubenswrapper[4931]: I1201 15:20:41.461406 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:41 crc kubenswrapper[4931]: I1201 15:20:41.461669 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:43 crc kubenswrapper[4931]: I1201 15:20:43.480862 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 15:20:43 crc 
kubenswrapper[4931]: I1201 15:20:43.514478 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:44 crc kubenswrapper[4931]: I1201 15:20:44.482754 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.369749 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.441460 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-credential-keys\") pod \"04a78a28-a980-4393-bd41-e0f523fccf7e\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.441499 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj87b\" (UniqueName: \"kubernetes.io/projected/04a78a28-a980-4393-bd41-e0f523fccf7e-kube-api-access-sj87b\") pod \"04a78a28-a980-4393-bd41-e0f523fccf7e\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.441566 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-combined-ca-bundle\") pod \"04a78a28-a980-4393-bd41-e0f523fccf7e\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.441652 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-scripts\") pod \"04a78a28-a980-4393-bd41-e0f523fccf7e\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 
15:20:46.441717 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-config-data\") pod \"04a78a28-a980-4393-bd41-e0f523fccf7e\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.441753 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-fernet-keys\") pod \"04a78a28-a980-4393-bd41-e0f523fccf7e\" (UID: \"04a78a28-a980-4393-bd41-e0f523fccf7e\") " Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.448224 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a78a28-a980-4393-bd41-e0f523fccf7e-kube-api-access-sj87b" (OuterVolumeSpecName: "kube-api-access-sj87b") pod "04a78a28-a980-4393-bd41-e0f523fccf7e" (UID: "04a78a28-a980-4393-bd41-e0f523fccf7e"). InnerVolumeSpecName "kube-api-access-sj87b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.448522 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "04a78a28-a980-4393-bd41-e0f523fccf7e" (UID: "04a78a28-a980-4393-bd41-e0f523fccf7e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.451102 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "04a78a28-a980-4393-bd41-e0f523fccf7e" (UID: "04a78a28-a980-4393-bd41-e0f523fccf7e"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.451591 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-scripts" (OuterVolumeSpecName: "scripts") pod "04a78a28-a980-4393-bd41-e0f523fccf7e" (UID: "04a78a28-a980-4393-bd41-e0f523fccf7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.514525 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04a78a28-a980-4393-bd41-e0f523fccf7e" (UID: "04a78a28-a980-4393-bd41-e0f523fccf7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.517242 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1dea62-ecc2-401d-b436-acc91fba2d5d","Type":"ContainerStarted","Data":"fa62247771a34f6f82920402fc848c1829a59ea972f86b85cba824d42267259b"} Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.519161 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x5lvn" event={"ID":"04a78a28-a980-4393-bd41-e0f523fccf7e","Type":"ContainerDied","Data":"f2cbe212a44cf02e6f5fa1977ddf0fa7ae2015e5f122248e9924f7c8eb140bac"} Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.519191 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2cbe212a44cf02e6f5fa1977ddf0fa7ae2015e5f122248e9924f7c8eb140bac" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.519259 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x5lvn" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.526797 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-config-data" (OuterVolumeSpecName: "config-data") pod "04a78a28-a980-4393-bd41-e0f523fccf7e" (UID: "04a78a28-a980-4393-bd41-e0f523fccf7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.543554 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.543808 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.543911 4931 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.543988 4931 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.544060 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj87b\" (UniqueName: \"kubernetes.io/projected/04a78a28-a980-4393-bd41-e0f523fccf7e-kube-api-access-sj87b\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.545171 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04a78a28-a980-4393-bd41-e0f523fccf7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.601402 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.602104 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.674613 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:20:46 crc kubenswrapper[4931]: I1201 15:20:46.674982 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.530546 4931 generic.go:334] "Generic (PLEG): container finished" podID="38f515fe-6925-463c-b5dc-87b23d360ec5" containerID="19cc472e6fe2a229b8b7de1d3e4ad6fbc2d6a34025debd2ed8dd363218082cf4" exitCode=0 Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.530749 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ffphc" event={"ID":"38f515fe-6925-463c-b5dc-87b23d360ec5","Type":"ContainerDied","Data":"19cc472e6fe2a229b8b7de1d3e4ad6fbc2d6a34025debd2ed8dd363218082cf4"} Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.538706 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3d6c071d-fd2a-43e2-a7d2-c2499809aad0","Type":"ContainerStarted","Data":"b27349e697d30399f8075c2cb003d90367ae0464a5d4d5c19e50e2438314d4c5"} Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.565791 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" 
event={"ID":"8da5b32f-da6b-48c5-8cf3-3f1699e1774b","Type":"ContainerStarted","Data":"a82bc2220298726e60cfbf6331b995975602f5be1a0173e3ac1a2586d86e4ef2"} Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.575492 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.580584 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qsr4v" event={"ID":"a79a9139-dcea-4b3f-83dc-a1715f087ac5","Type":"ContainerStarted","Data":"34c8c9011e6fd28dfc61e7a80b19ed99873729e8f31c22ff568a0cdb447142e7"} Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.600956 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.600940259 podStartE2EDuration="9.600940259s" podCreationTimestamp="2025-12-01 15:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:20:47.595784334 +0000 UTC m=+1194.021658001" watchObservedRunningTime="2025-12-01 15:20:47.600940259 +0000 UTC m=+1194.026813926" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.662602 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" podStartSLOduration=9.66258406 podStartE2EDuration="9.66258406s" podCreationTimestamp="2025-12-01 15:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:20:47.639708628 +0000 UTC m=+1194.065582295" watchObservedRunningTime="2025-12-01 15:20:47.66258406 +0000 UTC m=+1194.088457727" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.684795 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-qsr4v" podStartSLOduration=6.744932844 
podStartE2EDuration="1m1.684778894s" podCreationTimestamp="2025-12-01 15:19:46 +0000 UTC" firstStartedPulling="2025-12-01 15:19:51.306969659 +0000 UTC m=+1137.732843326" lastFinishedPulling="2025-12-01 15:20:46.246815709 +0000 UTC m=+1192.672689376" observedRunningTime="2025-12-01 15:20:47.682120639 +0000 UTC m=+1194.107994306" watchObservedRunningTime="2025-12-01 15:20:47.684778894 +0000 UTC m=+1194.110652561" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.696778 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7964d85c7c-w2fwr"] Dec 01 15:20:47 crc kubenswrapper[4931]: E1201 15:20:47.697171 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a78a28-a980-4393-bd41-e0f523fccf7e" containerName="keystone-bootstrap" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.697188 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a78a28-a980-4393-bd41-e0f523fccf7e" containerName="keystone-bootstrap" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.697401 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a78a28-a980-4393-bd41-e0f523fccf7e" containerName="keystone-bootstrap" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.697990 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.700361 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.701019 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w5fnq" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.703631 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.704093 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.704857 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.705043 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.716288 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7964d85c7c-w2fwr"] Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.871918 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-internal-tls-certs\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.871988 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-credential-keys\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " 
pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.872011 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-config-data\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.872030 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msvkp\" (UniqueName: \"kubernetes.io/projected/407f9337-3fab-42cc-b10d-eada296e7919-kube-api-access-msvkp\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.872055 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-public-tls-certs\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.872333 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-fernet-keys\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.872476 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-scripts\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " 
pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.872538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-combined-ca-bundle\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.974456 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-credential-keys\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.974528 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-config-data\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.974559 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msvkp\" (UniqueName: \"kubernetes.io/projected/407f9337-3fab-42cc-b10d-eada296e7919-kube-api-access-msvkp\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.974603 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-public-tls-certs\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 
15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.974640 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-fernet-keys\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.974683 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-scripts\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.974709 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-combined-ca-bundle\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.974818 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-internal-tls-certs\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.981136 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-public-tls-certs\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.980974 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-credential-keys\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.981962 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-scripts\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.982350 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-config-data\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.983208 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-combined-ca-bundle\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.984005 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-internal-tls-certs\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.984815 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/407f9337-3fab-42cc-b10d-eada296e7919-fernet-keys\") pod 
\"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:47 crc kubenswrapper[4931]: I1201 15:20:47.996828 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msvkp\" (UniqueName: \"kubernetes.io/projected/407f9337-3fab-42cc-b10d-eada296e7919-kube-api-access-msvkp\") pod \"keystone-7964d85c7c-w2fwr\" (UID: \"407f9337-3fab-42cc-b10d-eada296e7919\") " pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:48 crc kubenswrapper[4931]: I1201 15:20:48.025695 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:48 crc kubenswrapper[4931]: I1201 15:20:48.514853 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7964d85c7c-w2fwr"] Dec 01 15:20:48 crc kubenswrapper[4931]: W1201 15:20:48.535735 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod407f9337_3fab_42cc_b10d_eada296e7919.slice/crio-704204fe5dc39202387041fd94b586031601cb0f2e4b980a241d144075d5a71d WatchSource:0}: Error finding container 704204fe5dc39202387041fd94b586031601cb0f2e4b980a241d144075d5a71d: Status 404 returned error can't find the container with id 704204fe5dc39202387041fd94b586031601cb0f2e4b980a241d144075d5a71d Dec 01 15:20:48 crc kubenswrapper[4931]: I1201 15:20:48.590983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7964d85c7c-w2fwr" event={"ID":"407f9337-3fab-42cc-b10d-eada296e7919","Type":"ContainerStarted","Data":"704204fe5dc39202387041fd94b586031601cb0f2e4b980a241d144075d5a71d"} Dec 01 15:20:48 crc kubenswrapper[4931]: I1201 15:20:48.884653 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 15:20:48 crc kubenswrapper[4931]: I1201 15:20:48.884993 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 15:20:48 crc kubenswrapper[4931]: I1201 15:20:48.918327 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ffphc" Dec 01 15:20:48 crc kubenswrapper[4931]: I1201 15:20:48.953006 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 15:20:48 crc kubenswrapper[4931]: I1201 15:20:48.964543 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.007350 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjxkw\" (UniqueName: \"kubernetes.io/projected/38f515fe-6925-463c-b5dc-87b23d360ec5-kube-api-access-xjxkw\") pod \"38f515fe-6925-463c-b5dc-87b23d360ec5\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.007414 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f515fe-6925-463c-b5dc-87b23d360ec5-logs\") pod \"38f515fe-6925-463c-b5dc-87b23d360ec5\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.007439 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-config-data\") pod \"38f515fe-6925-463c-b5dc-87b23d360ec5\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.007598 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-combined-ca-bundle\") pod \"38f515fe-6925-463c-b5dc-87b23d360ec5\" (UID: 
\"38f515fe-6925-463c-b5dc-87b23d360ec5\") " Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.007640 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-scripts\") pod \"38f515fe-6925-463c-b5dc-87b23d360ec5\" (UID: \"38f515fe-6925-463c-b5dc-87b23d360ec5\") " Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.007887 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f515fe-6925-463c-b5dc-87b23d360ec5-logs" (OuterVolumeSpecName: "logs") pod "38f515fe-6925-463c-b5dc-87b23d360ec5" (UID: "38f515fe-6925-463c-b5dc-87b23d360ec5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.012132 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f515fe-6925-463c-b5dc-87b23d360ec5-kube-api-access-xjxkw" (OuterVolumeSpecName: "kube-api-access-xjxkw") pod "38f515fe-6925-463c-b5dc-87b23d360ec5" (UID: "38f515fe-6925-463c-b5dc-87b23d360ec5"). InnerVolumeSpecName "kube-api-access-xjxkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.017885 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-scripts" (OuterVolumeSpecName: "scripts") pod "38f515fe-6925-463c-b5dc-87b23d360ec5" (UID: "38f515fe-6925-463c-b5dc-87b23d360ec5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.041526 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38f515fe-6925-463c-b5dc-87b23d360ec5" (UID: "38f515fe-6925-463c-b5dc-87b23d360ec5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.043755 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-config-data" (OuterVolumeSpecName: "config-data") pod "38f515fe-6925-463c-b5dc-87b23d360ec5" (UID: "38f515fe-6925-463c-b5dc-87b23d360ec5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.109900 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.109949 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.109961 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjxkw\" (UniqueName: \"kubernetes.io/projected/38f515fe-6925-463c-b5dc-87b23d360ec5-kube-api-access-xjxkw\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.109976 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38f515fe-6925-463c-b5dc-87b23d360ec5-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:49 
crc kubenswrapper[4931]: I1201 15:20:49.109985 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f515fe-6925-463c-b5dc-87b23d360ec5-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.606249 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ffphc" event={"ID":"38f515fe-6925-463c-b5dc-87b23d360ec5","Type":"ContainerDied","Data":"92e40620aea4ac001f62cf6b285b3e1cb5403fd0c7c98cb361043b628cd588ed"} Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.606526 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92e40620aea4ac001f62cf6b285b3e1cb5403fd0c7c98cb361043b628cd588ed" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.606327 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ffphc" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.618235 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7964d85c7c-w2fwr" event={"ID":"407f9337-3fab-42cc-b10d-eada296e7919","Type":"ContainerStarted","Data":"98a472c1e788634bc7bc05980b5867fd64e4781bdb8667ee5ffcb522329e2978"} Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.619051 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.619193 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.619824 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.666734 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6c9d84d99b-fj4vs"] Dec 01 15:20:49 crc kubenswrapper[4931]: 
E1201 15:20:49.667132 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f515fe-6925-463c-b5dc-87b23d360ec5" containerName="placement-db-sync" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.667144 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f515fe-6925-463c-b5dc-87b23d360ec5" containerName="placement-db-sync" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.667304 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f515fe-6925-463c-b5dc-87b23d360ec5" containerName="placement-db-sync" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.668195 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.671856 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.672011 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.672182 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bcrdg" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.672280 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.676098 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.680430 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7964d85c7c-w2fwr" podStartSLOduration=2.680410964 podStartE2EDuration="2.680410964s" podCreationTimestamp="2025-12-01 15:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 15:20:49.647428978 +0000 UTC m=+1196.073302655" watchObservedRunningTime="2025-12-01 15:20:49.680410964 +0000 UTC m=+1196.106284631" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.694477 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c9d84d99b-fj4vs"] Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.835790 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-scripts\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.835855 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-combined-ca-bundle\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.835961 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-internal-tls-certs\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.836044 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-logs\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.836065 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-config-data\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.836090 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-public-tls-certs\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.836136 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcmr9\" (UniqueName: \"kubernetes.io/projected/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-kube-api-access-hcmr9\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.872098 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.872169 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.937581 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hcmr9\" (UniqueName: \"kubernetes.io/projected/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-kube-api-access-hcmr9\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.937685 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-scripts\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.937708 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-combined-ca-bundle\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.937750 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-internal-tls-certs\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.937811 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-logs\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.937838 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-config-data\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.937862 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-public-tls-certs\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.938960 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-logs\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.943750 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-internal-tls-certs\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.947090 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-config-data\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.950898 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-combined-ca-bundle\") pod \"placement-6c9d84d99b-fj4vs\" (UID: 
\"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.951352 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-public-tls-certs\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.959730 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-scripts\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.961195 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcmr9\" (UniqueName: \"kubernetes.io/projected/ae11272a-fb06-4ea3-8ab4-64a667d9cdd9-kube-api-access-hcmr9\") pod \"placement-6c9d84d99b-fj4vs\" (UID: \"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9\") " pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:49 crc kubenswrapper[4931]: I1201 15:20:49.985012 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:50 crc kubenswrapper[4931]: I1201 15:20:50.533180 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6c9d84d99b-fj4vs"] Dec 01 15:20:50 crc kubenswrapper[4931]: I1201 15:20:50.650407 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cmc5n" event={"ID":"658da6f1-ac70-4d83-83ca-f79e69f0979d","Type":"ContainerStarted","Data":"04b2d3391b567348655c3775a2e83ca33406614fb846da43e8e1f65cb26f4da3"} Dec 01 15:20:50 crc kubenswrapper[4931]: I1201 15:20:50.657019 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c9d84d99b-fj4vs" event={"ID":"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9","Type":"ContainerStarted","Data":"ddc9775437bd967f7df16d74f469a31973fead368ca377c203551fbc53217c43"} Dec 01 15:20:50 crc kubenswrapper[4931]: I1201 15:20:50.661922 4931 generic.go:334] "Generic (PLEG): container finished" podID="f9ada3f9-8074-48c1-a190-d6535e26e14f" containerID="62e26de25f4e45ec1dd95fc4d57b3fd2474c0eb0b45e67d57e026a25c974903c" exitCode=0 Dec 01 15:20:50 crc kubenswrapper[4931]: I1201 15:20:50.662061 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9qzq8" event={"ID":"f9ada3f9-8074-48c1-a190-d6535e26e14f","Type":"ContainerDied","Data":"62e26de25f4e45ec1dd95fc4d57b3fd2474c0eb0b45e67d57e026a25c974903c"} Dec 01 15:20:50 crc kubenswrapper[4931]: I1201 15:20:50.667226 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-cmc5n" podStartSLOduration=5.902245521 podStartE2EDuration="1m4.667205075s" podCreationTimestamp="2025-12-01 15:19:46 +0000 UTC" firstStartedPulling="2025-12-01 15:19:50.124754699 +0000 UTC m=+1136.550628366" lastFinishedPulling="2025-12-01 15:20:48.889714253 +0000 UTC m=+1195.315587920" observedRunningTime="2025-12-01 15:20:50.665718513 +0000 UTC m=+1197.091592180" watchObservedRunningTime="2025-12-01 
15:20:50.667205075 +0000 UTC m=+1197.093078752" Dec 01 15:20:51 crc kubenswrapper[4931]: I1201 15:20:51.679320 4931 generic.go:334] "Generic (PLEG): container finished" podID="a79a9139-dcea-4b3f-83dc-a1715f087ac5" containerID="34c8c9011e6fd28dfc61e7a80b19ed99873729e8f31c22ff568a0cdb447142e7" exitCode=0 Dec 01 15:20:51 crc kubenswrapper[4931]: I1201 15:20:51.679936 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qsr4v" event={"ID":"a79a9139-dcea-4b3f-83dc-a1715f087ac5","Type":"ContainerDied","Data":"34c8c9011e6fd28dfc61e7a80b19ed99873729e8f31c22ff568a0cdb447142e7"} Dec 01 15:20:51 crc kubenswrapper[4931]: I1201 15:20:51.691331 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 15:20:51 crc kubenswrapper[4931]: I1201 15:20:51.691629 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c9d84d99b-fj4vs" event={"ID":"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9","Type":"ContainerStarted","Data":"c0e038e897dd5fb35bdbe538772fc1c9f7fd558d2c5d3c05a783d2052decbf9f"} Dec 01 15:20:51 crc kubenswrapper[4931]: I1201 15:20:51.691691 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6c9d84d99b-fj4vs" event={"ID":"ae11272a-fb06-4ea3-8ab4-64a667d9cdd9","Type":"ContainerStarted","Data":"bca9411c2c5e41d6e2774c679f4d568a6f6178bea434fd5b03db79055f5fa0a1"} Dec 01 15:20:51 crc kubenswrapper[4931]: I1201 15:20:51.691709 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:51 crc kubenswrapper[4931]: I1201 15:20:51.691722 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:20:51 crc kubenswrapper[4931]: I1201 15:20:51.728994 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6c9d84d99b-fj4vs" podStartSLOduration=2.728971302 podStartE2EDuration="2.728971302s" 
podCreationTimestamp="2025-12-01 15:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:20:51.718178039 +0000 UTC m=+1198.144051716" watchObservedRunningTime="2025-12-01 15:20:51.728971302 +0000 UTC m=+1198.154844969" Dec 01 15:20:51 crc kubenswrapper[4931]: I1201 15:20:51.913501 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 15:20:53 crc kubenswrapper[4931]: I1201 15:20:53.688412 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 15:20:54 crc kubenswrapper[4931]: I1201 15:20:54.098571 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:20:54 crc kubenswrapper[4931]: I1201 15:20:54.156625 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-6q6f9"] Dec 01 15:20:54 crc kubenswrapper[4931]: I1201 15:20:54.156955 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" podUID="99c8e9db-f582-408e-a3ba-a0b583c3218c" containerName="dnsmasq-dns" containerID="cri-o://4bc915050d6b0979096f7179f32d5769ae62fc6d2cb5cf43b1cb63e4bc62ad8e" gracePeriod=10 Dec 01 15:20:54 crc kubenswrapper[4931]: I1201 15:20:54.747887 4931 generic.go:334] "Generic (PLEG): container finished" podID="99c8e9db-f582-408e-a3ba-a0b583c3218c" containerID="4bc915050d6b0979096f7179f32d5769ae62fc6d2cb5cf43b1cb63e4bc62ad8e" exitCode=0 Dec 01 15:20:54 crc kubenswrapper[4931]: I1201 15:20:54.747988 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" event={"ID":"99c8e9db-f582-408e-a3ba-a0b583c3218c","Type":"ContainerDied","Data":"4bc915050d6b0979096f7179f32d5769ae62fc6d2cb5cf43b1cb63e4bc62ad8e"} Dec 01 15:20:55 crc 
kubenswrapper[4931]: I1201 15:20:55.693503 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.717317 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qsr4v" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.758688 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qsr4v" event={"ID":"a79a9139-dcea-4b3f-83dc-a1715f087ac5","Type":"ContainerDied","Data":"1cbe00203cd76d9bdbf62f2370529250bc82e5aee7b57036d781006a8e0a95f2"} Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.758732 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cbe00203cd76d9bdbf62f2370529250bc82e5aee7b57036d781006a8e0a95f2" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.758732 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qsr4v" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.760201 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-config\") pod \"f9ada3f9-8074-48c1-a190-d6535e26e14f\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.760288 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-combined-ca-bundle\") pod \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.760332 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-combined-ca-bundle\") pod \"f9ada3f9-8074-48c1-a190-d6535e26e14f\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.760374 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kr7x\" (UniqueName: \"kubernetes.io/projected/f9ada3f9-8074-48c1-a190-d6535e26e14f-kube-api-access-5kr7x\") pod \"f9ada3f9-8074-48c1-a190-d6535e26e14f\" (UID: \"f9ada3f9-8074-48c1-a190-d6535e26e14f\") " Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.760545 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-db-sync-config-data\") pod \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.760566 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrdxb\" (UniqueName: \"kubernetes.io/projected/a79a9139-dcea-4b3f-83dc-a1715f087ac5-kube-api-access-rrdxb\") pod \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\" (UID: \"a79a9139-dcea-4b3f-83dc-a1715f087ac5\") " Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.774855 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ada3f9-8074-48c1-a190-d6535e26e14f-kube-api-access-5kr7x" (OuterVolumeSpecName: "kube-api-access-5kr7x") pod "f9ada3f9-8074-48c1-a190-d6535e26e14f" (UID: "f9ada3f9-8074-48c1-a190-d6535e26e14f"). InnerVolumeSpecName "kube-api-access-5kr7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.775546 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a79a9139-dcea-4b3f-83dc-a1715f087ac5" (UID: "a79a9139-dcea-4b3f-83dc-a1715f087ac5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.775814 4931 generic.go:334] "Generic (PLEG): container finished" podID="658da6f1-ac70-4d83-83ca-f79e69f0979d" containerID="04b2d3391b567348655c3775a2e83ca33406614fb846da43e8e1f65cb26f4da3" exitCode=0 Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.775915 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cmc5n" event={"ID":"658da6f1-ac70-4d83-83ca-f79e69f0979d","Type":"ContainerDied","Data":"04b2d3391b567348655c3775a2e83ca33406614fb846da43e8e1f65cb26f4da3"} Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.776141 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79a9139-dcea-4b3f-83dc-a1715f087ac5-kube-api-access-rrdxb" (OuterVolumeSpecName: "kube-api-access-rrdxb") pod "a79a9139-dcea-4b3f-83dc-a1715f087ac5" (UID: "a79a9139-dcea-4b3f-83dc-a1715f087ac5"). InnerVolumeSpecName "kube-api-access-rrdxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.788294 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9qzq8" event={"ID":"f9ada3f9-8074-48c1-a190-d6535e26e14f","Type":"ContainerDied","Data":"b49186fa60654394f63e110187c1f0b6e4827072d171a189dfc5cb43a545cf85"} Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.788663 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b49186fa60654394f63e110187c1f0b6e4827072d171a189dfc5cb43a545cf85" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.788744 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-9qzq8" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.806923 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-config" (OuterVolumeSpecName: "config") pod "f9ada3f9-8074-48c1-a190-d6535e26e14f" (UID: "f9ada3f9-8074-48c1-a190-d6535e26e14f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.807638 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a79a9139-dcea-4b3f-83dc-a1715f087ac5" (UID: "a79a9139-dcea-4b3f-83dc-a1715f087ac5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.807807 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9ada3f9-8074-48c1-a190-d6535e26e14f" (UID: "f9ada3f9-8074-48c1-a190-d6535e26e14f"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.862906 4931 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.862947 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrdxb\" (UniqueName: \"kubernetes.io/projected/a79a9139-dcea-4b3f-83dc-a1715f087ac5-kube-api-access-rrdxb\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.862964 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.862978 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a9139-dcea-4b3f-83dc-a1715f087ac5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.862990 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ada3f9-8074-48c1-a190-d6535e26e14f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:55 crc kubenswrapper[4931]: I1201 15:20:55.863000 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kr7x\" (UniqueName: \"kubernetes.io/projected/f9ada3f9-8074-48c1-a190-d6535e26e14f-kube-api-access-5kr7x\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.284337 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.371123 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh4td\" (UniqueName: \"kubernetes.io/projected/99c8e9db-f582-408e-a3ba-a0b583c3218c-kube-api-access-bh4td\") pod \"99c8e9db-f582-408e-a3ba-a0b583c3218c\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.371200 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-sb\") pod \"99c8e9db-f582-408e-a3ba-a0b583c3218c\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.371237 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-dns-svc\") pod \"99c8e9db-f582-408e-a3ba-a0b583c3218c\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.371307 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-nb\") pod \"99c8e9db-f582-408e-a3ba-a0b583c3218c\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.371452 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-config\") pod \"99c8e9db-f582-408e-a3ba-a0b583c3218c\" (UID: \"99c8e9db-f582-408e-a3ba-a0b583c3218c\") " Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.375310 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/99c8e9db-f582-408e-a3ba-a0b583c3218c-kube-api-access-bh4td" (OuterVolumeSpecName: "kube-api-access-bh4td") pod "99c8e9db-f582-408e-a3ba-a0b583c3218c" (UID: "99c8e9db-f582-408e-a3ba-a0b583c3218c"). InnerVolumeSpecName "kube-api-access-bh4td". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.420403 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99c8e9db-f582-408e-a3ba-a0b583c3218c" (UID: "99c8e9db-f582-408e-a3ba-a0b583c3218c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.421274 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99c8e9db-f582-408e-a3ba-a0b583c3218c" (UID: "99c8e9db-f582-408e-a3ba-a0b583c3218c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.423448 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-config" (OuterVolumeSpecName: "config") pod "99c8e9db-f582-408e-a3ba-a0b583c3218c" (UID: "99c8e9db-f582-408e-a3ba-a0b583c3218c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.425207 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99c8e9db-f582-408e-a3ba-a0b583c3218c" (UID: "99c8e9db-f582-408e-a3ba-a0b583c3218c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:56 crc kubenswrapper[4931]: E1201 15:20:56.470684 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.474057 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.474086 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.474098 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh4td\" (UniqueName: \"kubernetes.io/projected/99c8e9db-f582-408e-a3ba-a0b583c3218c-kube-api-access-bh4td\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.474110 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.474118 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99c8e9db-f582-408e-a3ba-a0b583c3218c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.602211 4931 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/horizon-6479b7c68-txrvx" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.679549 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-65c944c654-l6mmj" podUID="1a2f9f3b-603b-4004-8e6f-dce5b810785c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.798770 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" event={"ID":"99c8e9db-f582-408e-a3ba-a0b583c3218c","Type":"ContainerDied","Data":"63bb888d4f64e8c6f3e13a3052f20ce010e0ba291daa7243c451e7faa1e137d8"} Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.799110 4931 scope.go:117] "RemoveContainer" containerID="4bc915050d6b0979096f7179f32d5769ae62fc6d2cb5cf43b1cb63e4bc62ad8e" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.799051 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-6q6f9" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.805093 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1dea62-ecc2-401d-b436-acc91fba2d5d","Type":"ContainerStarted","Data":"0a878f0f4a2ce600e2d24d751569730947d20108a8b4d038586c5e6ec3ecd425"} Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.805196 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerName="sg-core" containerID="cri-o://fa62247771a34f6f82920402fc848c1829a59ea972f86b85cba824d42267259b" gracePeriod=30 Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.805321 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerName="proxy-httpd" containerID="cri-o://0a878f0f4a2ce600e2d24d751569730947d20108a8b4d038586c5e6ec3ecd425" gracePeriod=30 Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.890827 4931 scope.go:117] "RemoveContainer" containerID="a3aeec915025240539a85f2ce1224ff52450238dbd0a3b5faba20328794f396b" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.897830 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-kjjl9"] Dec 01 15:20:56 crc kubenswrapper[4931]: E1201 15:20:56.898285 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c8e9db-f582-408e-a3ba-a0b583c3218c" containerName="dnsmasq-dns" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.898302 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c8e9db-f582-408e-a3ba-a0b583c3218c" containerName="dnsmasq-dns" Dec 01 15:20:56 crc kubenswrapper[4931]: E1201 15:20:56.898326 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c8e9db-f582-408e-a3ba-a0b583c3218c" containerName="init" Dec 01 15:20:56 crc 
kubenswrapper[4931]: I1201 15:20:56.898335 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c8e9db-f582-408e-a3ba-a0b583c3218c" containerName="init" Dec 01 15:20:56 crc kubenswrapper[4931]: E1201 15:20:56.898359 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ada3f9-8074-48c1-a190-d6535e26e14f" containerName="neutron-db-sync" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.898367 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ada3f9-8074-48c1-a190-d6535e26e14f" containerName="neutron-db-sync" Dec 01 15:20:56 crc kubenswrapper[4931]: E1201 15:20:56.898397 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79a9139-dcea-4b3f-83dc-a1715f087ac5" containerName="barbican-db-sync" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.898405 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79a9139-dcea-4b3f-83dc-a1715f087ac5" containerName="barbican-db-sync" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.898647 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ada3f9-8074-48c1-a190-d6535e26e14f" containerName="neutron-db-sync" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.898676 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79a9139-dcea-4b3f-83dc-a1715f087ac5" containerName="barbican-db-sync" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.898684 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c8e9db-f582-408e-a3ba-a0b583c3218c" containerName="dnsmasq-dns" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.899995 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.921999 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-6q6f9"] Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.939674 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-6q6f9"] Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.957927 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-kjjl9"] Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.986193 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28brr\" (UniqueName: \"kubernetes.io/projected/fd10f983-b891-48d0-9537-24db2dcef8e0-kube-api-access-28brr\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.986252 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.986288 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.986412 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-config\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.986454 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.986489 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:56 crc kubenswrapper[4931]: I1201 15:20:56.995900 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6765f56b8d-pn88t"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.000512 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.008433 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rtw29" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.008593 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.008742 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.017125 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6765f56b8d-pn88t"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.059718 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-76d6fd8967-rrlqd"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.061590 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.103349 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-kjjl9"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.103787 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 01 15:20:57 crc kubenswrapper[4931]: E1201 15:20:57.105548 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-28brr ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" podUID="fd10f983-b891-48d0-9537-24db2dcef8e0" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111317 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-logs\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111509 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28brr\" (UniqueName: \"kubernetes.io/projected/fd10f983-b891-48d0-9537-24db2dcef8e0-kube-api-access-28brr\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111546 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " 
pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111595 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111692 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kl9x\" (UniqueName: \"kubernetes.io/projected/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-kube-api-access-9kl9x\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111752 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-config-data\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111774 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-combined-ca-bundle\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111870 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-config\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111933 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.111989 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.112017 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-config-data-custom\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.117113 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.118014 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.119015 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.121571 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-config\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.133655 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.154396 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76d6fd8967-rrlqd"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.165373 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28brr\" (UniqueName: \"kubernetes.io/projected/fd10f983-b891-48d0-9537-24db2dcef8e0-kube-api-access-28brr\") pod \"dnsmasq-dns-5ccc5c4795-kjjl9\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.215718 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-config-data-custom\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.215793 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-logs\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.215826 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-config-data-custom\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.215858 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd8t5\" (UniqueName: \"kubernetes.io/projected/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-kube-api-access-hd8t5\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.215893 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-combined-ca-bundle\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " 
pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.215927 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-logs\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.216017 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kl9x\" (UniqueName: \"kubernetes.io/projected/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-kube-api-access-9kl9x\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.216050 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-config-data\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.216074 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-combined-ca-bundle\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.216122 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-config-data\") pod 
\"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.217151 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-logs\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.228023 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-dxnfx"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.235707 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-config-data\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.244045 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kl9x\" (UniqueName: \"kubernetes.io/projected/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-kube-api-access-9kl9x\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.244600 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-config-data-custom\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.246352 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a83efe-a056-4531-b9f3-c6c4f87a9cdb-combined-ca-bundle\") pod \"barbican-keystone-listener-6765f56b8d-pn88t\" (UID: \"05a83efe-a056-4531-b9f3-c6c4f87a9cdb\") " pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.248040 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.274457 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6cf859c4fb-g2pzn"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.279707 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.295357 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.295621 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-x78jr" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.295742 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.296483 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.311442 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-dxnfx"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.317183 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjfhc\" (UniqueName: \"kubernetes.io/projected/5040c26c-bf75-4f8b-9ddd-5f5774467afb-kube-api-access-qjfhc\") pod 
\"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.317468 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-config-data\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.317546 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.317630 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-config\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.317728 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.317796 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-config-data-custom\") pod 
\"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.317874 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd8t5\" (UniqueName: \"kubernetes.io/projected/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-kube-api-access-hd8t5\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.317945 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-combined-ca-bundle\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.318016 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-logs\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.318118 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-svc\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.318188 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.322926 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-logs\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.328637 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-combined-ca-bundle\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.329128 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-config-data-custom\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.356301 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-config-data\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.371992 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd8t5\" (UniqueName: 
\"kubernetes.io/projected/4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5-kube-api-access-hd8t5\") pod \"barbican-worker-76d6fd8967-rrlqd\" (UID: \"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5\") " pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.396761 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cf859c4fb-g2pzn"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.411081 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.419740 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-ovndb-tls-certs\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420002 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-svc\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420091 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420162 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-httpd-config\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420229 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjfhc\" (UniqueName: \"kubernetes.io/projected/5040c26c-bf75-4f8b-9ddd-5f5774467afb-kube-api-access-qjfhc\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420310 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-config\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420429 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420511 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-config\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420695 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420758 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-combined-ca-bundle\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.420837 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k9cl\" (UniqueName: \"kubernetes.io/projected/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-kube-api-access-5k9cl\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.421902 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-svc\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.422247 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.422885 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-config\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.423026 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.423042 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-969f6549d-4t8ht"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.423455 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.441768 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76d6fd8967-rrlqd" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.445256 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.455874 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.493409 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjfhc\" (UniqueName: \"kubernetes.io/projected/5040c26c-bf75-4f8b-9ddd-5f5774467afb-kube-api-access-qjfhc\") pod \"dnsmasq-dns-688c87cc99-dxnfx\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.502872 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-969f6549d-4t8ht"] Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.522762 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data-custom\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.525788 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-combined-ca-bundle\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.525990 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k9cl\" (UniqueName: \"kubernetes.io/projected/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-kube-api-access-5k9cl\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 
15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.528126 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-ovndb-tls-certs\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.528237 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0442bbe2-b143-4004-a1d8-e207e159423e-logs\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.528405 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-combined-ca-bundle\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.528672 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-httpd-config\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.528801 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m49wm\" (UniqueName: \"kubernetes.io/projected/0442bbe2-b143-4004-a1d8-e207e159423e-kube-api-access-m49wm\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc 
kubenswrapper[4931]: I1201 15:20:57.528833 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-config\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.528882 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.533354 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-combined-ca-bundle\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.534152 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-httpd-config\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.535093 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-config\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.535185 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-ovndb-tls-certs\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.552203 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k9cl\" (UniqueName: \"kubernetes.io/projected/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-kube-api-access-5k9cl\") pod \"neutron-6cf859c4fb-g2pzn\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.564863 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cmc5n" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.616897 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.641968 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-combined-ca-bundle\") pod \"658da6f1-ac70-4d83-83ca-f79e69f0979d\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642044 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h69rh\" (UniqueName: \"kubernetes.io/projected/658da6f1-ac70-4d83-83ca-f79e69f0979d-kube-api-access-h69rh\") pod \"658da6f1-ac70-4d83-83ca-f79e69f0979d\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642094 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-db-sync-config-data\") pod 
\"658da6f1-ac70-4d83-83ca-f79e69f0979d\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642184 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/658da6f1-ac70-4d83-83ca-f79e69f0979d-etc-machine-id\") pod \"658da6f1-ac70-4d83-83ca-f79e69f0979d\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642268 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-scripts\") pod \"658da6f1-ac70-4d83-83ca-f79e69f0979d\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642307 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-config-data\") pod \"658da6f1-ac70-4d83-83ca-f79e69f0979d\" (UID: \"658da6f1-ac70-4d83-83ca-f79e69f0979d\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642555 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data-custom\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642603 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0442bbe2-b143-4004-a1d8-e207e159423e-logs\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642645 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-combined-ca-bundle\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642695 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m49wm\" (UniqueName: \"kubernetes.io/projected/0442bbe2-b143-4004-a1d8-e207e159423e-kube-api-access-m49wm\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.642716 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.650913 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-scripts" (OuterVolumeSpecName: "scripts") pod "658da6f1-ac70-4d83-83ca-f79e69f0979d" (UID: "658da6f1-ac70-4d83-83ca-f79e69f0979d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.651736 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658da6f1-ac70-4d83-83ca-f79e69f0979d-kube-api-access-h69rh" (OuterVolumeSpecName: "kube-api-access-h69rh") pod "658da6f1-ac70-4d83-83ca-f79e69f0979d" (UID: "658da6f1-ac70-4d83-83ca-f79e69f0979d"). InnerVolumeSpecName "kube-api-access-h69rh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.657830 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658da6f1-ac70-4d83-83ca-f79e69f0979d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "658da6f1-ac70-4d83-83ca-f79e69f0979d" (UID: "658da6f1-ac70-4d83-83ca-f79e69f0979d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.658906 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "658da6f1-ac70-4d83-83ca-f79e69f0979d" (UID: "658da6f1-ac70-4d83-83ca-f79e69f0979d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.663135 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0442bbe2-b143-4004-a1d8-e207e159423e-logs\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.681975 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.683752 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data-custom\") pod \"barbican-api-969f6549d-4t8ht\" (UID: 
\"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.687025 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-combined-ca-bundle\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.687731 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.701922 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m49wm\" (UniqueName: \"kubernetes.io/projected/0442bbe2-b143-4004-a1d8-e207e159423e-kube-api-access-m49wm\") pod \"barbican-api-969f6549d-4t8ht\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.702032 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "658da6f1-ac70-4d83-83ca-f79e69f0979d" (UID: "658da6f1-ac70-4d83-83ca-f79e69f0979d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.742099 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-config-data" (OuterVolumeSpecName: "config-data") pod "658da6f1-ac70-4d83-83ca-f79e69f0979d" (UID: "658da6f1-ac70-4d83-83ca-f79e69f0979d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.745355 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/658da6f1-ac70-4d83-83ca-f79e69f0979d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.745406 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.745416 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.745424 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.745435 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h69rh\" (UniqueName: \"kubernetes.io/projected/658da6f1-ac70-4d83-83ca-f79e69f0979d-kube-api-access-h69rh\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.745448 4931 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/658da6f1-ac70-4d83-83ca-f79e69f0979d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.805109 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.816156 4931 generic.go:334] "Generic (PLEG): container finished" podID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerID="0a878f0f4a2ce600e2d24d751569730947d20108a8b4d038586c5e6ec3ecd425" exitCode=0 Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.816205 4931 generic.go:334] "Generic (PLEG): container finished" podID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerID="fa62247771a34f6f82920402fc848c1829a59ea972f86b85cba824d42267259b" exitCode=2 Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.816233 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1dea62-ecc2-401d-b436-acc91fba2d5d","Type":"ContainerDied","Data":"0a878f0f4a2ce600e2d24d751569730947d20108a8b4d038586c5e6ec3ecd425"} Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.816297 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1dea62-ecc2-401d-b436-acc91fba2d5d","Type":"ContainerDied","Data":"fa62247771a34f6f82920402fc848c1829a59ea972f86b85cba824d42267259b"} Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.817811 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cmc5n" event={"ID":"658da6f1-ac70-4d83-83ca-f79e69f0979d","Type":"ContainerDied","Data":"5adc16cb91c1386b583c653b6632be1423676e99f137a3df706caa3fdad7854f"} Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.817835 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5adc16cb91c1386b583c653b6632be1423676e99f137a3df706caa3fdad7854f" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.817925 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cmc5n" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.829566 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.843655 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.949450 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-nb\") pod \"fd10f983-b891-48d0-9537-24db2dcef8e0\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.949571 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-sb\") pod \"fd10f983-b891-48d0-9537-24db2dcef8e0\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.949599 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-config\") pod \"fd10f983-b891-48d0-9537-24db2dcef8e0\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.949631 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28brr\" (UniqueName: \"kubernetes.io/projected/fd10f983-b891-48d0-9537-24db2dcef8e0-kube-api-access-28brr\") pod \"fd10f983-b891-48d0-9537-24db2dcef8e0\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.949656 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-swift-storage-0\") pod \"fd10f983-b891-48d0-9537-24db2dcef8e0\" (UID: 
\"fd10f983-b891-48d0-9537-24db2dcef8e0\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.949675 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-svc\") pod \"fd10f983-b891-48d0-9537-24db2dcef8e0\" (UID: \"fd10f983-b891-48d0-9537-24db2dcef8e0\") " Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.950453 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-config" (OuterVolumeSpecName: "config") pod "fd10f983-b891-48d0-9537-24db2dcef8e0" (UID: "fd10f983-b891-48d0-9537-24db2dcef8e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.951004 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd10f983-b891-48d0-9537-24db2dcef8e0" (UID: "fd10f983-b891-48d0-9537-24db2dcef8e0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.951110 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd10f983-b891-48d0-9537-24db2dcef8e0" (UID: "fd10f983-b891-48d0-9537-24db2dcef8e0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.951261 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fd10f983-b891-48d0-9537-24db2dcef8e0" (UID: "fd10f983-b891-48d0-9537-24db2dcef8e0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.951286 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fd10f983-b891-48d0-9537-24db2dcef8e0" (UID: "fd10f983-b891-48d0-9537-24db2dcef8e0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.952139 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.952163 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.952195 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.952207 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-swift-storage-0\") on node 
\"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.952219 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd10f983-b891-48d0-9537-24db2dcef8e0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:57 crc kubenswrapper[4931]: I1201 15:20:57.955639 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd10f983-b891-48d0-9537-24db2dcef8e0-kube-api-access-28brr" (OuterVolumeSpecName: "kube-api-access-28brr") pod "fd10f983-b891-48d0-9537-24db2dcef8e0" (UID: "fd10f983-b891-48d0-9537-24db2dcef8e0"). InnerVolumeSpecName "kube-api-access-28brr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.000455 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.022723 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 15:20:58 crc kubenswrapper[4931]: E1201 15:20:58.023136 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerName="proxy-httpd" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.023156 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerName="proxy-httpd" Dec 01 15:20:58 crc kubenswrapper[4931]: E1201 15:20:58.023185 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658da6f1-ac70-4d83-83ca-f79e69f0979d" containerName="cinder-db-sync" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.023193 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="658da6f1-ac70-4d83-83ca-f79e69f0979d" containerName="cinder-db-sync" Dec 01 15:20:58 crc kubenswrapper[4931]: E1201 15:20:58.023206 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerName="sg-core" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.023214 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerName="sg-core" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.023422 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="658da6f1-ac70-4d83-83ca-f79e69f0979d" containerName="cinder-db-sync" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.023450 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerName="proxy-httpd" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.023463 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" containerName="sg-core" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.024517 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.033603 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.033909 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.034186 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5w9qw" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.034402 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.060339 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28brr\" (UniqueName: \"kubernetes.io/projected/fd10f983-b891-48d0-9537-24db2dcef8e0-kube-api-access-28brr\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:58 crc 
kubenswrapper[4931]: I1201 15:20:58.060418 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.125163 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-dxnfx"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.162884 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-combined-ca-bundle\") pod \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.163202 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-config-data\") pod \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.163355 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wtrw\" (UniqueName: \"kubernetes.io/projected/6e1dea62-ecc2-401d-b436-acc91fba2d5d-kube-api-access-2wtrw\") pod \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.163488 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-scripts\") pod \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.163661 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-log-httpd\") pod 
\"6e1dea62-ecc2-401d-b436-acc91fba2d5d\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.163775 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-sg-core-conf-yaml\") pod \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.163887 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-run-httpd\") pod \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\" (UID: \"6e1dea62-ecc2-401d-b436-acc91fba2d5d\") " Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.164106 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-6h4cs"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.164287 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.164374 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e63c8e3a-39a0-4a76-9754-16250b21f1dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.164463 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.164583 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6pbq\" (UniqueName: \"kubernetes.io/projected/e63c8e3a-39a0-4a76-9754-16250b21f1dc-kube-api-access-l6pbq\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.164658 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.164823 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.165717 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.176988 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e1dea62-ecc2-401d-b436-acc91fba2d5d" (UID: "6e1dea62-ecc2-401d-b436-acc91fba2d5d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.177273 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e1dea62-ecc2-401d-b436-acc91fba2d5d" (UID: "6e1dea62-ecc2-401d-b436-acc91fba2d5d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.177343 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1dea62-ecc2-401d-b436-acc91fba2d5d-kube-api-access-2wtrw" (OuterVolumeSpecName: "kube-api-access-2wtrw") pod "6e1dea62-ecc2-401d-b436-acc91fba2d5d" (UID: "6e1dea62-ecc2-401d-b436-acc91fba2d5d"). InnerVolumeSpecName "kube-api-access-2wtrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.197459 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-6h4cs"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.198302 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-scripts" (OuterVolumeSpecName: "scripts") pod "6e1dea62-ecc2-401d-b436-acc91fba2d5d" (UID: "6e1dea62-ecc2-401d-b436-acc91fba2d5d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.218047 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6765f56b8d-pn88t"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.219615 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e1dea62-ecc2-401d-b436-acc91fba2d5d" (UID: "6e1dea62-ecc2-401d-b436-acc91fba2d5d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.261769 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e1dea62-ecc2-401d-b436-acc91fba2d5d" (UID: "6e1dea62-ecc2-401d-b436-acc91fba2d5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.261786 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-config-data" (OuterVolumeSpecName: "config-data") pod "6e1dea62-ecc2-401d-b436-acc91fba2d5d" (UID: "6e1dea62-ecc2-401d-b436-acc91fba2d5d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.269874 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.270627 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.270873 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8lrl\" (UniqueName: \"kubernetes.io/projected/17ca6a4a-52ed-4de1-8265-8095615b7887-kube-api-access-d8lrl\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.270995 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.271169 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-config\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: 
\"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.271268 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6pbq\" (UniqueName: \"kubernetes.io/projected/e63c8e3a-39a0-4a76-9754-16250b21f1dc-kube-api-access-l6pbq\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.271378 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.271725 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.271871 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.272013 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e63c8e3a-39a0-4a76-9754-16250b21f1dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 
15:20:58.272111 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.272202 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.272334 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wtrw\" (UniqueName: \"kubernetes.io/projected/6e1dea62-ecc2-401d-b436-acc91fba2d5d-kube-api-access-2wtrw\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.272499 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.271905 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c8e9db-f582-408e-a3ba-a0b583c3218c" path="/var/lib/kubelet/pods/99c8e9db-f582-408e-a3ba-a0b583c3218c/volumes" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.273252 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e63c8e3a-39a0-4a76-9754-16250b21f1dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.272575 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.274024 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76d6fd8967-rrlqd"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.274220 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.274519 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e1dea62-ecc2-401d-b436-acc91fba2d5d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.274707 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.274783 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1dea62-ecc2-401d-b436-acc91fba2d5d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.277139 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.277852 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data-custom\") pod \"cinder-scheduler-0\" 
(UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.278550 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.283676 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.293970 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6pbq\" (UniqueName: \"kubernetes.io/projected/e63c8e3a-39a0-4a76-9754-16250b21f1dc-kube-api-access-l6pbq\") pod \"cinder-scheduler-0\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.373354 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.374676 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.376208 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-config\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.376332 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.376356 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.376399 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.376436 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8lrl\" (UniqueName: \"kubernetes.io/projected/17ca6a4a-52ed-4de1-8265-8095615b7887-kube-api-access-d8lrl\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 
crc kubenswrapper[4931]: I1201 15:20:58.376454 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.377196 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.378274 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-config\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.378842 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.380513 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.381210 4931 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.381210 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.381863 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.396625 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.414819 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8lrl\" (UniqueName: \"kubernetes.io/projected/17ca6a4a-52ed-4de1-8265-8095615b7887-kube-api-access-d8lrl\") pod \"dnsmasq-dns-6bb4fc677f-6h4cs\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.482223 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-logs\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.488160 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.488226 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.488310 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-scripts\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.488347 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.488462 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data-custom\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.488525 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t78l6\" (UniqueName: \"kubernetes.io/projected/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-kube-api-access-t78l6\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.495319 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-dxnfx"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 
15:20:58.509263 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.589800 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-scripts\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.589876 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.589964 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data-custom\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.590014 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t78l6\" (UniqueName: \"kubernetes.io/projected/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-kube-api-access-t78l6\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.590094 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-logs\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.590135 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.590168 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.592713 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.592893 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-logs\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.599886 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cf859c4fb-g2pzn"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.602124 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.614199 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t78l6\" (UniqueName: 
\"kubernetes.io/projected/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-kube-api-access-t78l6\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.620521 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.626303 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-scripts\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.628884 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data-custom\") pod \"cinder-api-0\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.643839 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-969f6549d-4t8ht"] Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.717712 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.841203 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cf859c4fb-g2pzn" event={"ID":"648f1c18-1467-4f3c-9ec3-8e1289c57a4f","Type":"ContainerStarted","Data":"23ff55ac72a9293b09f23682c396bcefce879e31dd82ba8f074f56ce45a62f7e"} Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.846510 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e1dea62-ecc2-401d-b436-acc91fba2d5d","Type":"ContainerDied","Data":"b7815eac10028557d88d23ec8ed3d80680d38060787c776b6694afb2da3f62a8"} Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.846546 4931 scope.go:117] "RemoveContainer" containerID="0a878f0f4a2ce600e2d24d751569730947d20108a8b4d038586c5e6ec3ecd425" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.846666 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.869075 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-969f6549d-4t8ht" event={"ID":"0442bbe2-b143-4004-a1d8-e207e159423e","Type":"ContainerStarted","Data":"c9118773c361ae090075db3b713b04fd11050c41582b4703a3713fc71dc63e1e"} Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.888022 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" event={"ID":"5040c26c-bf75-4f8b-9ddd-5f5774467afb","Type":"ContainerStarted","Data":"107e16f51737daa154e9b22b99248323af2a268c1bca76424face6e8a6b70351"} Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.898883 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76d6fd8967-rrlqd" event={"ID":"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5","Type":"ContainerStarted","Data":"6e2122774a12cfb75d195fae55725e982e5d1f613e248b421adb8ccb973028d3"} Dec 01 15:20:58 crc 
kubenswrapper[4931]: I1201 15:20:58.900940 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" event={"ID":"05a83efe-a056-4531-b9f3-c6c4f87a9cdb","Type":"ContainerStarted","Data":"65f5f9c063d08c757f4be463a06c52f48f5cdd4086cf58cae1fb30da63273957"} Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.900957 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-kjjl9" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.937689 4931 scope.go:117] "RemoveContainer" containerID="fa62247771a34f6f82920402fc848c1829a59ea972f86b85cba824d42267259b" Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.966856 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:20:58 crc kubenswrapper[4931]: W1201 15:20:58.981123 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode63c8e3a_39a0_4a76_9754_16250b21f1dc.slice/crio-80f81a767b7c449b8c78b87e2d974293933df58edfa3c29710c734b6ca066cc2 WatchSource:0}: Error finding container 80f81a767b7c449b8c78b87e2d974293933df58edfa3c29710c734b6ca066cc2: Status 404 returned error can't find the container with id 80f81a767b7c449b8c78b87e2d974293933df58edfa3c29710c734b6ca066cc2 Dec 01 15:20:58 crc kubenswrapper[4931]: I1201 15:20:58.999455 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.009033 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.029377 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.031688 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.034650 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.037876 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.041181 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-kjjl9"] Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.060260 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.068965 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-kjjl9"] Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.100606 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.101599 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-6h4cs"] Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.102572 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-scripts\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.102697 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz99t\" (UniqueName: 
\"kubernetes.io/projected/cf71c332-a4fd-4709-9465-db8a45e1e5a7-kube-api-access-jz99t\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.102826 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-run-httpd\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.102939 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-log-httpd\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.103301 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.103498 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-config-data\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.206805 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-config-data\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " 
pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.207133 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.207180 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-scripts\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.207197 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz99t\" (UniqueName: \"kubernetes.io/projected/cf71c332-a4fd-4709-9465-db8a45e1e5a7-kube-api-access-jz99t\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.207227 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-run-httpd\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.207252 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-log-httpd\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.207271 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.209584 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-log-httpd\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.210286 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-run-httpd\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.217925 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.217989 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.218535 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-config-data\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.222905 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz99t\" (UniqueName: \"kubernetes.io/projected/cf71c332-a4fd-4709-9465-db8a45e1e5a7-kube-api-access-jz99t\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.224236 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-scripts\") pod \"ceilometer-0\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.355104 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.369988 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.928456 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-969f6549d-4t8ht" event={"ID":"0442bbe2-b143-4004-a1d8-e207e159423e","Type":"ContainerStarted","Data":"fdd9563c0cb5345cf18ebaf800ec8ff6b1c1077865c86bf1579f8fb3aa155adf"} Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.929136 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-969f6549d-4t8ht" event={"ID":"0442bbe2-b143-4004-a1d8-e207e159423e","Type":"ContainerStarted","Data":"020ab7ec7b0bb107fd755ee78a7ced6530ee7d2d1d483d142b30a61b46295d2b"} Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.929163 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.930492 4931 generic.go:334] "Generic (PLEG): container finished" podID="17ca6a4a-52ed-4de1-8265-8095615b7887" containerID="f628d85b18b094f0613c32bc73bb9e82d66d9afcb95a886c8cc49d563bb9bb7b" 
exitCode=0 Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.930691 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" event={"ID":"17ca6a4a-52ed-4de1-8265-8095615b7887","Type":"ContainerDied","Data":"f628d85b18b094f0613c32bc73bb9e82d66d9afcb95a886c8cc49d563bb9bb7b"} Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.930745 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" event={"ID":"17ca6a4a-52ed-4de1-8265-8095615b7887","Type":"ContainerStarted","Data":"a0fb5a4367e410422062471ebd18e2d0c0b15c3f9c29ab5d397a63310d7fb6c8"} Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.937088 4931 generic.go:334] "Generic (PLEG): container finished" podID="5040c26c-bf75-4f8b-9ddd-5f5774467afb" containerID="f83a965d4d57c0812d20fe8b2533ce81c2d4b03b6b3136f4de5ddfd02deeaa1d" exitCode=0 Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.937202 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" event={"ID":"5040c26c-bf75-4f8b-9ddd-5f5774467afb","Type":"ContainerDied","Data":"f83a965d4d57c0812d20fe8b2533ce81c2d4b03b6b3136f4de5ddfd02deeaa1d"} Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.939182 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e63c8e3a-39a0-4a76-9754-16250b21f1dc","Type":"ContainerStarted","Data":"80f81a767b7c449b8c78b87e2d974293933df58edfa3c29710c734b6ca066cc2"} Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.945348 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cf859c4fb-g2pzn" event={"ID":"648f1c18-1467-4f3c-9ec3-8e1289c57a4f","Type":"ContainerStarted","Data":"c624b834b4c99846acbfd0f012a6bcd0efadaca9bd00aea534cdf91fea2fd53a"} Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.945423 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cf859c4fb-g2pzn" 
event={"ID":"648f1c18-1467-4f3c-9ec3-8e1289c57a4f","Type":"ContainerStarted","Data":"4d33ae8b0efc7fff609af493e5981b6538b653290a86dc610c8a12ad0a6c98c9"} Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.945605 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.954070 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35","Type":"ContainerStarted","Data":"2d21b3df0de5b4b7b4a6689606ea8ac993ead324bb0ab996da08445d6abfee60"} Dec 01 15:20:59 crc kubenswrapper[4931]: I1201 15:20:59.970527 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-969f6549d-4t8ht" podStartSLOduration=2.9705071 podStartE2EDuration="2.9705071s" podCreationTimestamp="2025-12-01 15:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:20:59.968525754 +0000 UTC m=+1206.394399441" watchObservedRunningTime="2025-12-01 15:20:59.9705071 +0000 UTC m=+1206.396380787" Dec 01 15:21:00 crc kubenswrapper[4931]: I1201 15:21:00.027996 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6cf859c4fb-g2pzn" podStartSLOduration=3.027972974 podStartE2EDuration="3.027972974s" podCreationTimestamp="2025-12-01 15:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:00.019320081 +0000 UTC m=+1206.445193758" watchObservedRunningTime="2025-12-01 15:21:00.027972974 +0000 UTC m=+1206.453846661" Dec 01 15:21:00 crc kubenswrapper[4931]: I1201 15:21:00.070685 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:00 crc kubenswrapper[4931]: I1201 15:21:00.260164 4931 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6e1dea62-ecc2-401d-b436-acc91fba2d5d" path="/var/lib/kubelet/pods/6e1dea62-ecc2-401d-b436-acc91fba2d5d/volumes" Dec 01 15:21:00 crc kubenswrapper[4931]: I1201 15:21:00.260934 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd10f983-b891-48d0-9537-24db2dcef8e0" path="/var/lib/kubelet/pods/fd10f983-b891-48d0-9537-24db2dcef8e0/volumes" Dec 01 15:21:00 crc kubenswrapper[4931]: I1201 15:21:00.778106 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 15:21:00 crc kubenswrapper[4931]: I1201 15:21:00.966976 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35","Type":"ContainerStarted","Data":"9dfb5a9b243efa01bd33b02dabf56331196b29c50a01cc2d9aa347f9eadbfee4"} Dec 01 15:21:00 crc kubenswrapper[4931]: I1201 15:21:00.967202 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:21:01 crc kubenswrapper[4931]: I1201 15:21:01.989549 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" event={"ID":"5040c26c-bf75-4f8b-9ddd-5f5774467afb","Type":"ContainerDied","Data":"107e16f51737daa154e9b22b99248323af2a268c1bca76424face6e8a6b70351"} Dec 01 15:21:01 crc kubenswrapper[4931]: I1201 15:21:01.990045 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="107e16f51737daa154e9b22b99248323af2a268c1bca76424face6e8a6b70351" Dec 01 15:21:01 crc kubenswrapper[4931]: I1201 15:21:01.991502 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerStarted","Data":"33abc3552173454383028cea5dc9f7d95e5b01a3adce90e411a0f399ff3dc026"} Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.019159 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.056536 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-swift-storage-0\") pod \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.056606 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-config\") pod \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.056715 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjfhc\" (UniqueName: \"kubernetes.io/projected/5040c26c-bf75-4f8b-9ddd-5f5774467afb-kube-api-access-qjfhc\") pod \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.056995 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-sb\") pod \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.057041 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-nb\") pod \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.057106 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-svc\") pod \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\" (UID: \"5040c26c-bf75-4f8b-9ddd-5f5774467afb\") " Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.063792 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5040c26c-bf75-4f8b-9ddd-5f5774467afb-kube-api-access-qjfhc" (OuterVolumeSpecName: "kube-api-access-qjfhc") pod "5040c26c-bf75-4f8b-9ddd-5f5774467afb" (UID: "5040c26c-bf75-4f8b-9ddd-5f5774467afb"). InnerVolumeSpecName "kube-api-access-qjfhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.160135 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjfhc\" (UniqueName: \"kubernetes.io/projected/5040c26c-bf75-4f8b-9ddd-5f5774467afb-kube-api-access-qjfhc\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.161848 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5040c26c-bf75-4f8b-9ddd-5f5774467afb" (UID: "5040c26c-bf75-4f8b-9ddd-5f5774467afb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.171114 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-config" (OuterVolumeSpecName: "config") pod "5040c26c-bf75-4f8b-9ddd-5f5774467afb" (UID: "5040c26c-bf75-4f8b-9ddd-5f5774467afb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.171486 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5040c26c-bf75-4f8b-9ddd-5f5774467afb" (UID: "5040c26c-bf75-4f8b-9ddd-5f5774467afb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.174136 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5040c26c-bf75-4f8b-9ddd-5f5774467afb" (UID: "5040c26c-bf75-4f8b-9ddd-5f5774467afb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.174341 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5040c26c-bf75-4f8b-9ddd-5f5774467afb" (UID: "5040c26c-bf75-4f8b-9ddd-5f5774467afb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.262069 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.262098 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.262110 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.262121 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:02 crc kubenswrapper[4931]: I1201 15:21:02.262131 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5040c26c-bf75-4f8b-9ddd-5f5774467afb-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.001769 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" event={"ID":"05a83efe-a056-4531-b9f3-c6c4f87a9cdb","Type":"ContainerStarted","Data":"6267af44727568147728977b213b422e6aad5481d22f94d9cbf64152b21731ba"} Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.004323 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"e63c8e3a-39a0-4a76-9754-16250b21f1dc","Type":"ContainerStarted","Data":"0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856"} Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.006154 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" event={"ID":"17ca6a4a-52ed-4de1-8265-8095615b7887","Type":"ContainerStarted","Data":"b2649c121ccc78a4cee5705404f61c30d44525dcdc9de0e0a6516b4d17477a75"} Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.006205 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-dxnfx" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.041196 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" podStartSLOduration=5.041174661 podStartE2EDuration="5.041174661s" podCreationTimestamp="2025-12-01 15:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:03.028102574 +0000 UTC m=+1209.453976281" watchObservedRunningTime="2025-12-01 15:21:03.041174661 +0000 UTC m=+1209.467048328" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.088375 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-dxnfx"] Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.088451 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-dxnfx"] Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.510153 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.527019 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f86b4c4b5-zs55r"] Dec 01 15:21:03 crc kubenswrapper[4931]: E1201 15:21:03.527488 4931 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5040c26c-bf75-4f8b-9ddd-5f5774467afb" containerName="init" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.527505 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="5040c26c-bf75-4f8b-9ddd-5f5774467afb" containerName="init" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.527741 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="5040c26c-bf75-4f8b-9ddd-5f5774467afb" containerName="init" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.529123 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.531711 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.532706 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.539023 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f86b4c4b5-zs55r"] Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.587609 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-public-tls-certs\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.587673 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-combined-ca-bundle\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 
15:21:03.587733 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdnh4\" (UniqueName: \"kubernetes.io/projected/38afe8fe-5f17-4be2-84b6-da4211785ed1-kube-api-access-cdnh4\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.587759 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-config\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.587788 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-httpd-config\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.587814 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-ovndb-tls-certs\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.587838 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-internal-tls-certs\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 
15:21:03.689239 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-combined-ca-bundle\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.689361 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdnh4\" (UniqueName: \"kubernetes.io/projected/38afe8fe-5f17-4be2-84b6-da4211785ed1-kube-api-access-cdnh4\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.689452 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-config\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.689488 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-httpd-config\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.689535 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-ovndb-tls-certs\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.689617 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-internal-tls-certs\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.689651 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-public-tls-certs\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.713300 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-ovndb-tls-certs\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.713814 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-internal-tls-certs\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.722245 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-config\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.734070 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-httpd-config\") pod 
\"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.737415 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-public-tls-certs\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.741147 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdnh4\" (UniqueName: \"kubernetes.io/projected/38afe8fe-5f17-4be2-84b6-da4211785ed1-kube-api-access-cdnh4\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.749346 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38afe8fe-5f17-4be2-84b6-da4211785ed1-combined-ca-bundle\") pod \"neutron-6f86b4c4b5-zs55r\" (UID: \"38afe8fe-5f17-4be2-84b6-da4211785ed1\") " pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:03 crc kubenswrapper[4931]: I1201 15:21:03.909429 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.020242 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerStarted","Data":"9f849267533ea27449b15fbc7a1db9e8b8c2c36ed33166cb3a4ad826115f0e97"} Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.022196 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" event={"ID":"05a83efe-a056-4531-b9f3-c6c4f87a9cdb","Type":"ContainerStarted","Data":"6102a5b49968fca0ea7bfc84fd75c5a83f2b3c76cc351b7d3c105eb99d43dc19"} Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.024309 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e63c8e3a-39a0-4a76-9754-16250b21f1dc","Type":"ContainerStarted","Data":"e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575"} Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.026450 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerName="cinder-api-log" containerID="cri-o://9dfb5a9b243efa01bd33b02dabf56331196b29c50a01cc2d9aa347f9eadbfee4" gracePeriod=30 Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.026679 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35","Type":"ContainerStarted","Data":"505a4c722f8adc4c972a03b2188dc88a288cc5d4f10ab7ad614c562cd4af6702"} Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.026705 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.026733 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" 
podUID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerName="cinder-api" containerID="cri-o://505a4c722f8adc4c972a03b2188dc88a288cc5d4f10ab7ad614c562cd4af6702" gracePeriod=30 Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.041786 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6765f56b8d-pn88t" podStartSLOduration=4.389758668 podStartE2EDuration="8.041767009s" podCreationTimestamp="2025-12-01 15:20:56 +0000 UTC" firstStartedPulling="2025-12-01 15:20:58.236448348 +0000 UTC m=+1204.662322005" lastFinishedPulling="2025-12-01 15:21:01.888456679 +0000 UTC m=+1208.314330346" observedRunningTime="2025-12-01 15:21:04.039617538 +0000 UTC m=+1210.465491195" watchObservedRunningTime="2025-12-01 15:21:04.041767009 +0000 UTC m=+1210.467640676" Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.074470 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.074452437 podStartE2EDuration="6.074452437s" podCreationTimestamp="2025-12-01 15:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:04.060253038 +0000 UTC m=+1210.486126705" watchObservedRunningTime="2025-12-01 15:21:04.074452437 +0000 UTC m=+1210.500326104" Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.260191 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5040c26c-bf75-4f8b-9ddd-5f5774467afb" path="/var/lib/kubelet/pods/5040c26c-bf75-4f8b-9ddd-5f5774467afb/volumes" Dec 01 15:21:04 crc kubenswrapper[4931]: I1201 15:21:04.527041 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f86b4c4b5-zs55r"] Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.049108 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f86b4c4b5-zs55r" 
event={"ID":"38afe8fe-5f17-4be2-84b6-da4211785ed1","Type":"ContainerStarted","Data":"60705c0bff04d7df92f2555f3dcc806eae472b96715cd75a2d8a3c9c9b8584ec"} Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.050373 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f86b4c4b5-zs55r" event={"ID":"38afe8fe-5f17-4be2-84b6-da4211785ed1","Type":"ContainerStarted","Data":"1d6ef2ab7ead44e2d302f8081ade807cdaeabef921be338eb9c814cca34001e6"} Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.069036 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerStarted","Data":"fc01f84aba9de69b5c38161a1e55ec6d3cb25f9d688143aba884db04a34e939b"} Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.083909 4931 generic.go:334] "Generic (PLEG): container finished" podID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerID="505a4c722f8adc4c972a03b2188dc88a288cc5d4f10ab7ad614c562cd4af6702" exitCode=0 Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.083944 4931 generic.go:334] "Generic (PLEG): container finished" podID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerID="9dfb5a9b243efa01bd33b02dabf56331196b29c50a01cc2d9aa347f9eadbfee4" exitCode=143 Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.084028 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35","Type":"ContainerDied","Data":"505a4c722f8adc4c972a03b2188dc88a288cc5d4f10ab7ad614c562cd4af6702"} Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.084067 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.084107 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35","Type":"ContainerDied","Data":"9dfb5a9b243efa01bd33b02dabf56331196b29c50a01cc2d9aa347f9eadbfee4"} Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.138571 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.266042236 podStartE2EDuration="8.13855377s" podCreationTimestamp="2025-12-01 15:20:57 +0000 UTC" firstStartedPulling="2025-12-01 15:20:58.997891278 +0000 UTC m=+1205.423764935" lastFinishedPulling="2025-12-01 15:21:01.870402802 +0000 UTC m=+1208.296276469" observedRunningTime="2025-12-01 15:21:05.110238354 +0000 UTC m=+1211.536112031" watchObservedRunningTime="2025-12-01 15:21:05.13855377 +0000 UTC m=+1211.564427437" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.143870 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data\") pod \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.144083 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-etc-machine-id\") pod \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.144151 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-combined-ca-bundle\") pod \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\" (UID: 
\"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.144223 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data-custom\") pod \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.144309 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" (UID: "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.144332 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-scripts\") pod \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.144495 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-logs\") pod \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.144907 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t78l6\" (UniqueName: \"kubernetes.io/projected/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-kube-api-access-t78l6\") pod \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.145485 4931 reconciler_common.go:293] "Volume detached for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.155710 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-logs" (OuterVolumeSpecName: "logs") pod "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" (UID: "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.156014 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" (UID: "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.156311 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-scripts" (OuterVolumeSpecName: "scripts") pod "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" (UID: "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.185108 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-kube-api-access-t78l6" (OuterVolumeSpecName: "kube-api-access-t78l6") pod "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" (UID: "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35"). InnerVolumeSpecName "kube-api-access-t78l6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.246798 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.246826 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.246837 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.246845 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t78l6\" (UniqueName: \"kubernetes.io/projected/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-kube-api-access-t78l6\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:05 crc kubenswrapper[4931]: E1201 15:21:05.369942 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-combined-ca-bundle podName:f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35 nodeName:}" failed. No retries permitted until 2025-12-01 15:21:05.869906059 +0000 UTC m=+1212.295779716 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-combined-ca-bundle") pod "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" (UID: "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35") : error deleting /var/lib/kubelet/pods/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35/volume-subpaths: remove /var/lib/kubelet/pods/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35/volume-subpaths: no such file or directory Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.373024 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data" (OuterVolumeSpecName: "config-data") pod "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" (UID: "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.449060 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.955012 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-combined-ca-bundle\") pod \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\" (UID: \"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35\") " Dec 01 15:21:05 crc kubenswrapper[4931]: I1201 15:21:05.965967 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" (UID: "f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.077320 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.096923 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76d6fd8967-rrlqd" event={"ID":"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5","Type":"ContainerStarted","Data":"ea2b89720640eb54cf7ac164efa5546cab729b2ac8e55ff62316f7eeff966b62"} Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.098820 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35","Type":"ContainerDied","Data":"2d21b3df0de5b4b7b4a6689606ea8ac993ead324bb0ab996da08445d6abfee60"} Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.098862 4931 scope.go:117] "RemoveContainer" containerID="505a4c722f8adc4c972a03b2188dc88a288cc5d4f10ab7ad614c562cd4af6702" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.098890 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.103403 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f86b4c4b5-zs55r" event={"ID":"38afe8fe-5f17-4be2-84b6-da4211785ed1","Type":"ContainerStarted","Data":"47dbc4de4f5077c824110fd7fbb92e0a35d238040b130ecde4318a61fcdd637b"} Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.125548 4931 scope.go:117] "RemoveContainer" containerID="9dfb5a9b243efa01bd33b02dabf56331196b29c50a01cc2d9aa347f9eadbfee4" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.145773 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.213444 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.230575 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 15:21:06 crc kubenswrapper[4931]: E1201 15:21:06.230962 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerName="cinder-api-log" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.230982 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerName="cinder-api-log" Dec 01 15:21:06 crc kubenswrapper[4931]: E1201 15:21:06.231010 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerName="cinder-api" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.231017 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerName="cinder-api" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.231167 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerName="cinder-api-log" Dec 01 15:21:06 crc 
kubenswrapper[4931]: I1201 15:21:06.231186 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" containerName="cinder-api" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.232085 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.235261 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.235607 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.235826 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.277822 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35" path="/var/lib/kubelet/pods/f8a52313-5d05-4dfb-a2ea-b9f4b09b6c35/volumes" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.278363 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.283013 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mfjg\" (UniqueName: \"kubernetes.io/projected/5d91c4a2-739a-4533-a21b-9aa362069d32-kube-api-access-5mfjg\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.283290 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " 
pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.283361 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.283506 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-scripts\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.283543 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-config-data\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.283564 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d91c4a2-739a-4533-a21b-9aa362069d32-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.283607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d91c4a2-739a-4533-a21b-9aa362069d32-logs\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.283718 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-config-data-custom\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.283806 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.385812 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-scripts\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.385858 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-config-data\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.385877 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d91c4a2-739a-4533-a21b-9aa362069d32-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.385912 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d91c4a2-739a-4533-a21b-9aa362069d32-logs\") pod \"cinder-api-0\" (UID: 
\"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.385961 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-config-data-custom\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.385993 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.386022 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mfjg\" (UniqueName: \"kubernetes.io/projected/5d91c4a2-739a-4533-a21b-9aa362069d32-kube-api-access-5mfjg\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.386050 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.386077 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.386439 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d91c4a2-739a-4533-a21b-9aa362069d32-logs\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.386540 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d91c4a2-739a-4533-a21b-9aa362069d32-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.391936 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-config-data-custom\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.392441 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-config-data\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.395996 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.401876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc 
kubenswrapper[4931]: I1201 15:21:06.402530 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.403134 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d91c4a2-739a-4533-a21b-9aa362069d32-scripts\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.409644 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mfjg\" (UniqueName: \"kubernetes.io/projected/5d91c4a2-739a-4533-a21b-9aa362069d32-kube-api-access-5mfjg\") pod \"cinder-api-0\" (UID: \"5d91c4a2-739a-4533-a21b-9aa362069d32\") " pod="openstack/cinder-api-0" Dec 01 15:21:06 crc kubenswrapper[4931]: I1201 15:21:06.573912 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 15:21:07 crc kubenswrapper[4931]: I1201 15:21:07.113920 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerStarted","Data":"340370a8052e60d9faef18c959ef8c8ea5620847409fc6f64ab48d97e78238bc"} Dec 01 15:21:07 crc kubenswrapper[4931]: I1201 15:21:07.116035 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76d6fd8967-rrlqd" event={"ID":"4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5","Type":"ContainerStarted","Data":"e3a0884253f51df56595c61af39274cd0d04b0a94be50f0a13c8568346624d5b"} Dec 01 15:21:07 crc kubenswrapper[4931]: I1201 15:21:07.117396 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:07 crc kubenswrapper[4931]: I1201 15:21:07.136824 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-76d6fd8967-rrlqd" podStartSLOduration=3.383894782 podStartE2EDuration="10.136807473s" podCreationTimestamp="2025-12-01 15:20:57 +0000 UTC" firstStartedPulling="2025-12-01 15:20:58.263843327 +0000 UTC m=+1204.689716994" lastFinishedPulling="2025-12-01 15:21:05.016756018 +0000 UTC m=+1211.442629685" observedRunningTime="2025-12-01 15:21:07.135917398 +0000 UTC m=+1213.561791065" watchObservedRunningTime="2025-12-01 15:21:07.136807473 +0000 UTC m=+1213.562681140" Dec 01 15:21:07 crc kubenswrapper[4931]: W1201 15:21:07.149475 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d91c4a2_739a_4533_a21b_9aa362069d32.slice/crio-2614c614d798e7064aa7f0907a400592dcbe2661afe2027c512e54aef57f68f4 WatchSource:0}: Error finding container 2614c614d798e7064aa7f0907a400592dcbe2661afe2027c512e54aef57f68f4: Status 404 returned error can't find the container with id 
2614c614d798e7064aa7f0907a400592dcbe2661afe2027c512e54aef57f68f4 Dec 01 15:21:07 crc kubenswrapper[4931]: I1201 15:21:07.154583 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 15:21:07 crc kubenswrapper[4931]: I1201 15:21:07.169532 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f86b4c4b5-zs55r" podStartSLOduration=4.169514532 podStartE2EDuration="4.169514532s" podCreationTimestamp="2025-12-01 15:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:07.164380068 +0000 UTC m=+1213.590253735" watchObservedRunningTime="2025-12-01 15:21:07.169514532 +0000 UTC m=+1213.595388199" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.150537 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d91c4a2-739a-4533-a21b-9aa362069d32","Type":"ContainerStarted","Data":"fa00dc6f56eed057b13d9451319718198af1cbeb610d35e7406627fcdbdd9833"} Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.150848 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d91c4a2-739a-4533-a21b-9aa362069d32","Type":"ContainerStarted","Data":"2614c614d798e7064aa7f0907a400592dcbe2661afe2027c512e54aef57f68f4"} Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.237844 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-779b959886-w78q9"] Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.256023 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.262135 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.267611 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.324521 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-combined-ca-bundle\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.324583 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-config-data-custom\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.324603 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-config-data\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.324626 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-internal-tls-certs\") pod \"barbican-api-779b959886-w78q9\" (UID: 
\"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.324666 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6kdw\" (UniqueName: \"kubernetes.io/projected/93a16952-5c17-428c-b560-4a661b8b5416-kube-api-access-q6kdw\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.324725 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93a16952-5c17-428c-b560-4a661b8b5416-logs\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.324766 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-public-tls-certs\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.328405 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-779b959886-w78q9"] Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.384874 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.429478 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-combined-ca-bundle\") pod \"barbican-api-779b959886-w78q9\" (UID: 
\"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.429528 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-config-data-custom\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.429552 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-config-data\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.429587 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-internal-tls-certs\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.429642 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6kdw\" (UniqueName: \"kubernetes.io/projected/93a16952-5c17-428c-b560-4a661b8b5416-kube-api-access-q6kdw\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.429704 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93a16952-5c17-428c-b560-4a661b8b5416-logs\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " 
pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.429745 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-public-tls-certs\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.442468 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-combined-ca-bundle\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.443869 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93a16952-5c17-428c-b560-4a661b8b5416-logs\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.445603 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-config-data-custom\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.458361 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-config-data\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 
15:21:08.470836 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-public-tls-certs\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.510166 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6kdw\" (UniqueName: \"kubernetes.io/projected/93a16952-5c17-428c-b560-4a661b8b5416-kube-api-access-q6kdw\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.511770 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.514315 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93a16952-5c17-428c-b560-4a661b8b5416-internal-tls-certs\") pod \"barbican-api-779b959886-w78q9\" (UID: \"93a16952-5c17-428c-b560-4a661b8b5416\") " pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.618227 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.649658 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-2mz7d"] Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.650008 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" podUID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" containerName="dnsmasq-dns" containerID="cri-o://a82bc2220298726e60cfbf6331b995975602f5be1a0173e3ac1a2586d86e4ef2" gracePeriod=10 Dec 01 15:21:08 crc kubenswrapper[4931]: I1201 15:21:08.933688 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.175649 4931 generic.go:334] "Generic (PLEG): container finished" podID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" containerID="a82bc2220298726e60cfbf6331b995975602f5be1a0173e3ac1a2586d86e4ef2" exitCode=0 Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.175982 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" event={"ID":"8da5b32f-da6b-48c5-8cf3-3f1699e1774b","Type":"ContainerDied","Data":"a82bc2220298726e60cfbf6331b995975602f5be1a0173e3ac1a2586d86e4ef2"} Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.191566 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d91c4a2-739a-4533-a21b-9aa362069d32","Type":"ContainerStarted","Data":"ba40ae0f95b12cce53d55b0d79dc4dec7b653147912687762c24bffbf9aa4dba"} Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.191611 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.207932 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-779b959886-w78q9"] Dec 01 15:21:09 crc kubenswrapper[4931]: 
I1201 15:21:09.216623 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerStarted","Data":"d7bdef9f5f4ff27e3daa7252a373c1ac4bbaa741ae163a7c18fb9e488e289c6e"} Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.216660 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.225207 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.22518604 podStartE2EDuration="3.22518604s" podCreationTimestamp="2025-12-01 15:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:09.214944673 +0000 UTC m=+1215.640818340" watchObservedRunningTime="2025-12-01 15:21:09.22518604 +0000 UTC m=+1215.651059707" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.277848 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.295772 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.112510155 podStartE2EDuration="11.295751623s" podCreationTimestamp="2025-12-01 15:20:58 +0000 UTC" firstStartedPulling="2025-12-01 15:21:01.856745218 +0000 UTC m=+1208.282618885" lastFinishedPulling="2025-12-01 15:21:08.039986686 +0000 UTC m=+1214.465860353" observedRunningTime="2025-12-01 15:21:09.260691568 +0000 UTC m=+1215.686565235" watchObservedRunningTime="2025-12-01 15:21:09.295751623 +0000 UTC m=+1215.721625290" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.347274 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.450243 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-config\") pod \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.450418 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-sb\") pod \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.450467 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-nb\") pod \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.450488 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-swift-storage-0\") pod \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.450510 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-svc\") pod \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.450537 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9j6h\" 
(UniqueName: \"kubernetes.io/projected/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-kube-api-access-f9j6h\") pod \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\" (UID: \"8da5b32f-da6b-48c5-8cf3-3f1699e1774b\") " Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.477565 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-kube-api-access-f9j6h" (OuterVolumeSpecName: "kube-api-access-f9j6h") pod "8da5b32f-da6b-48c5-8cf3-3f1699e1774b" (UID: "8da5b32f-da6b-48c5-8cf3-3f1699e1774b"). InnerVolumeSpecName "kube-api-access-f9j6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.553459 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9j6h\" (UniqueName: \"kubernetes.io/projected/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-kube-api-access-f9j6h\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.585095 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8da5b32f-da6b-48c5-8cf3-3f1699e1774b" (UID: "8da5b32f-da6b-48c5-8cf3-3f1699e1774b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.590825 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8da5b32f-da6b-48c5-8cf3-3f1699e1774b" (UID: "8da5b32f-da6b-48c5-8cf3-3f1699e1774b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.596638 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-config" (OuterVolumeSpecName: "config") pod "8da5b32f-da6b-48c5-8cf3-3f1699e1774b" (UID: "8da5b32f-da6b-48c5-8cf3-3f1699e1774b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.597363 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8da5b32f-da6b-48c5-8cf3-3f1699e1774b" (UID: "8da5b32f-da6b-48c5-8cf3-3f1699e1774b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.638868 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8da5b32f-da6b-48c5-8cf3-3f1699e1774b" (UID: "8da5b32f-da6b-48c5-8cf3-3f1699e1774b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.656329 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.656372 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.656407 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.656422 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.656438 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8da5b32f-da6b-48c5-8cf3-3f1699e1774b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.812626 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:21:09 crc kubenswrapper[4931]: I1201 15:21:09.956744 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.224503 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" 
event={"ID":"8da5b32f-da6b-48c5-8cf3-3f1699e1774b","Type":"ContainerDied","Data":"f365c2db35cacae74456068495322990007bcb8ff02a92f4f3a04c63857be40d"} Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.224551 4931 scope.go:117] "RemoveContainer" containerID="a82bc2220298726e60cfbf6331b995975602f5be1a0173e3ac1a2586d86e4ef2" Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.224608 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.227983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-779b959886-w78q9" event={"ID":"93a16952-5c17-428c-b560-4a661b8b5416","Type":"ContainerStarted","Data":"3f9af3f412cf2817181a5b42fc6d2fed0daaaede956f35f01c51b55a98b46b62"} Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.228020 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-779b959886-w78q9" event={"ID":"93a16952-5c17-428c-b560-4a661b8b5416","Type":"ContainerStarted","Data":"8ac6709a9711a36146f51913faded8aa606fe60caa56b3b1705d633d4a9f0f5e"} Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.228034 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.228044 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-779b959886-w78q9" event={"ID":"93a16952-5c17-428c-b560-4a661b8b5416","Type":"ContainerStarted","Data":"bb2489268d84de930af2bad8fe9cbac6c5d6dce1fcd92464b08566b57c1a405b"} Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.228170 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerName="cinder-scheduler" containerID="cri-o://0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856" gracePeriod=30 
Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.228978 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.229030 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerName="probe" containerID="cri-o://e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575" gracePeriod=30 Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.252763 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-779b959886-w78q9" podStartSLOduration=2.252748797 podStartE2EDuration="2.252748797s" podCreationTimestamp="2025-12-01 15:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:10.249497055 +0000 UTC m=+1216.675370762" watchObservedRunningTime="2025-12-01 15:21:10.252748797 +0000 UTC m=+1216.678622464" Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.256121 4931 scope.go:117] "RemoveContainer" containerID="e265b69b915159314ed4ce14cded3338ecf91fe0215ac4f5ec0b072268421a3d" Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.284052 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-2mz7d"] Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.291577 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-2mz7d"] Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.404475 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:21:10 crc kubenswrapper[4931]: I1201 15:21:10.546450 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:21:11 crc 
kubenswrapper[4931]: I1201 15:21:11.239423 4931 generic.go:334] "Generic (PLEG): container finished" podID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerID="e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575" exitCode=0 Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.239498 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e63c8e3a-39a0-4a76-9754-16250b21f1dc","Type":"ContainerDied","Data":"e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575"} Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.747136 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.932629 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e63c8e3a-39a0-4a76-9754-16250b21f1dc-etc-machine-id\") pod \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.932690 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data-custom\") pod \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.932788 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-combined-ca-bundle\") pod \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.932817 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data\") pod \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.932815 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e63c8e3a-39a0-4a76-9754-16250b21f1dc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e63c8e3a-39a0-4a76-9754-16250b21f1dc" (UID: "e63c8e3a-39a0-4a76-9754-16250b21f1dc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.932858 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6pbq\" (UniqueName: \"kubernetes.io/projected/e63c8e3a-39a0-4a76-9754-16250b21f1dc-kube-api-access-l6pbq\") pod \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.933003 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-scripts\") pod \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\" (UID: \"e63c8e3a-39a0-4a76-9754-16250b21f1dc\") " Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.933577 4931 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e63c8e3a-39a0-4a76-9754-16250b21f1dc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.953600 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63c8e3a-39a0-4a76-9754-16250b21f1dc-kube-api-access-l6pbq" (OuterVolumeSpecName: "kube-api-access-l6pbq") pod "e63c8e3a-39a0-4a76-9754-16250b21f1dc" (UID: "e63c8e3a-39a0-4a76-9754-16250b21f1dc"). InnerVolumeSpecName "kube-api-access-l6pbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.953735 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.953947 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e63c8e3a-39a0-4a76-9754-16250b21f1dc" (UID: "e63c8e3a-39a0-4a76-9754-16250b21f1dc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:11 crc kubenswrapper[4931]: I1201 15:21:11.958323 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-scripts" (OuterVolumeSpecName: "scripts") pod "e63c8e3a-39a0-4a76-9754-16250b21f1dc" (UID: "e63c8e3a-39a0-4a76-9754-16250b21f1dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.026540 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e63c8e3a-39a0-4a76-9754-16250b21f1dc" (UID: "e63c8e3a-39a0-4a76-9754-16250b21f1dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.036283 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.036313 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.036322 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6pbq\" (UniqueName: \"kubernetes.io/projected/e63c8e3a-39a0-4a76-9754-16250b21f1dc-kube-api-access-l6pbq\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.036333 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.055683 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-65c944c654-l6mmj" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.134401 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data" (OuterVolumeSpecName: "config-data") pod "e63c8e3a-39a0-4a76-9754-16250b21f1dc" (UID: "e63c8e3a-39a0-4a76-9754-16250b21f1dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.138078 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e63c8e3a-39a0-4a76-9754-16250b21f1dc-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.153298 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6479b7c68-txrvx"] Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.310817 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" path="/var/lib/kubelet/pods/8da5b32f-da6b-48c5-8cf3-3f1699e1774b/volumes" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.334735 4931 generic.go:334] "Generic (PLEG): container finished" podID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerID="0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856" exitCode=0 Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.335566 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.345334 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e63c8e3a-39a0-4a76-9754-16250b21f1dc","Type":"ContainerDied","Data":"0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856"} Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.345441 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e63c8e3a-39a0-4a76-9754-16250b21f1dc","Type":"ContainerDied","Data":"80f81a767b7c449b8c78b87e2d974293933df58edfa3c29710c734b6ca066cc2"} Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.345513 4931 scope.go:117] "RemoveContainer" containerID="e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.345871 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6479b7c68-txrvx" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon-log" containerID="cri-o://91144dfad5dcee4bce9e3dffbc29f3988a0e250ee32ee1cfa8b7b6b0c59d19c5" gracePeriod=30 Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.346514 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6479b7c68-txrvx" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon" containerID="cri-o://63a67a4f00ff8d02071cd5c46b0692c9f484b2d61acc41dcf8b2b53a1f51fdb8" gracePeriod=30 Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.382927 4931 scope.go:117] "RemoveContainer" containerID="0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.393064 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.418604 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-scheduler-0"] Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.421033 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 15:21:12 crc kubenswrapper[4931]: E1201 15:21:12.421409 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerName="cinder-scheduler" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.421421 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerName="cinder-scheduler" Dec 01 15:21:12 crc kubenswrapper[4931]: E1201 15:21:12.421436 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" containerName="init" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.421441 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" containerName="init" Dec 01 15:21:12 crc kubenswrapper[4931]: E1201 15:21:12.421465 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" containerName="dnsmasq-dns" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.421472 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" containerName="dnsmasq-dns" Dec 01 15:21:12 crc kubenswrapper[4931]: E1201 15:21:12.421492 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerName="probe" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.421497 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerName="probe" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.421655 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerName="probe" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.421672 4931 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" containerName="dnsmasq-dns" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.421682 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" containerName="cinder-scheduler" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.422536 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.429074 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.433311 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.440846 4931 scope.go:117] "RemoveContainer" containerID="e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575" Dec 01 15:21:12 crc kubenswrapper[4931]: E1201 15:21:12.441922 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575\": container with ID starting with e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575 not found: ID does not exist" containerID="e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.441970 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575"} err="failed to get container status \"e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575\": rpc error: code = NotFound desc = could not find container \"e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575\": container with ID starting with 
e786c9aa68ee65814a885c19726d9771efe252c66f2ca69dbb6d4b093b88c575 not found: ID does not exist" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.441995 4931 scope.go:117] "RemoveContainer" containerID="0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856" Dec 01 15:21:12 crc kubenswrapper[4931]: E1201 15:21:12.450506 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856\": container with ID starting with 0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856 not found: ID does not exist" containerID="0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.450549 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856"} err="failed to get container status \"0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856\": rpc error: code = NotFound desc = could not find container \"0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856\": container with ID starting with 0677dddc9efbc160c5fd369586fa7c40a17a08ad47a842dad050d6c4c3c43856 not found: ID does not exist" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.554305 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cae1c45-3e2d-4df6-93a6-b133953bdce0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.554368 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.554429 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.554449 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94qnv\" (UniqueName: \"kubernetes.io/projected/2cae1c45-3e2d-4df6-93a6-b133953bdce0-kube-api-access-94qnv\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.554473 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-config-data\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.554550 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.655764 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-scripts\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " 
pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.656048 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.656146 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94qnv\" (UniqueName: \"kubernetes.io/projected/2cae1c45-3e2d-4df6-93a6-b133953bdce0-kube-api-access-94qnv\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.656245 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-config-data\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.656406 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.656513 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cae1c45-3e2d-4df6-93a6-b133953bdce0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.656654 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2cae1c45-3e2d-4df6-93a6-b133953bdce0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.660927 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-scripts\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.661274 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.661570 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.663143 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cae1c45-3e2d-4df6-93a6-b133953bdce0-config-data\") pod \"cinder-scheduler-0\" (UID: \"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.673645 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94qnv\" (UniqueName: \"kubernetes.io/projected/2cae1c45-3e2d-4df6-93a6-b133953bdce0-kube-api-access-94qnv\") pod \"cinder-scheduler-0\" (UID: 
\"2cae1c45-3e2d-4df6-93a6-b133953bdce0\") " pod="openstack/cinder-scheduler-0" Dec 01 15:21:12 crc kubenswrapper[4931]: I1201 15:21:12.740608 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 15:21:13 crc kubenswrapper[4931]: I1201 15:21:13.249793 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 15:21:13 crc kubenswrapper[4931]: W1201 15:21:13.256856 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cae1c45_3e2d_4df6_93a6_b133953bdce0.slice/crio-8589e18d6bd4ccd438ffeadf49f9d9de3e9778b852cae755ae21b8189379f8bf WatchSource:0}: Error finding container 8589e18d6bd4ccd438ffeadf49f9d9de3e9778b852cae755ae21b8189379f8bf: Status 404 returned error can't find the container with id 8589e18d6bd4ccd438ffeadf49f9d9de3e9778b852cae755ae21b8189379f8bf Dec 01 15:21:13 crc kubenswrapper[4931]: I1201 15:21:13.345861 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2cae1c45-3e2d-4df6-93a6-b133953bdce0","Type":"ContainerStarted","Data":"8589e18d6bd4ccd438ffeadf49f9d9de3e9778b852cae755ae21b8189379f8bf"} Dec 01 15:21:14 crc kubenswrapper[4931]: I1201 15:21:14.098563 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-2mz7d" podUID="8da5b32f-da6b-48c5-8cf3-3f1699e1774b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Dec 01 15:21:14 crc kubenswrapper[4931]: I1201 15:21:14.257970 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e63c8e3a-39a0-4a76-9754-16250b21f1dc" path="/var/lib/kubelet/pods/e63c8e3a-39a0-4a76-9754-16250b21f1dc/volumes" Dec 01 15:21:14 crc kubenswrapper[4931]: I1201 15:21:14.362197 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"2cae1c45-3e2d-4df6-93a6-b133953bdce0","Type":"ContainerStarted","Data":"0e919bf404f1283177fca85510ce6755e1c1380a4f15f3512589dec8e72e7fd7"} Dec 01 15:21:15 crc kubenswrapper[4931]: I1201 15:21:15.372401 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2cae1c45-3e2d-4df6-93a6-b133953bdce0","Type":"ContainerStarted","Data":"c54c8e50d1df096648b095bf04193c2fbff62cda30a3b3abf70466bf181beed6"} Dec 01 15:21:15 crc kubenswrapper[4931]: I1201 15:21:15.411294 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.411277048 podStartE2EDuration="3.411277048s" podCreationTimestamp="2025-12-01 15:21:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:15.40565058 +0000 UTC m=+1221.831524257" watchObservedRunningTime="2025-12-01 15:21:15.411277048 +0000 UTC m=+1221.837150715" Dec 01 15:21:16 crc kubenswrapper[4931]: I1201 15:21:16.386548 4931 generic.go:334] "Generic (PLEG): container finished" podID="97ed61f3-8ca0-4aee-afae-168398babe70" containerID="63a67a4f00ff8d02071cd5c46b0692c9f484b2d61acc41dcf8b2b53a1f51fdb8" exitCode=0 Dec 01 15:21:16 crc kubenswrapper[4931]: I1201 15:21:16.386608 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6479b7c68-txrvx" event={"ID":"97ed61f3-8ca0-4aee-afae-168398babe70","Type":"ContainerDied","Data":"63a67a4f00ff8d02071cd5c46b0692c9f484b2d61acc41dcf8b2b53a1f51fdb8"} Dec 01 15:21:16 crc kubenswrapper[4931]: I1201 15:21:16.601483 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6479b7c68-txrvx" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 01 15:21:17 crc 
kubenswrapper[4931]: I1201 15:21:17.741663 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 15:21:18 crc kubenswrapper[4931]: I1201 15:21:18.520631 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 15:21:19 crc kubenswrapper[4931]: I1201 15:21:19.612670 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7964d85c7c-w2fwr" Dec 01 15:21:19 crc kubenswrapper[4931]: I1201 15:21:19.871619 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:21:19 crc kubenswrapper[4931]: I1201 15:21:19.871670 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.174374 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.193395 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-779b959886-w78q9" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.262398 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-969f6549d-4t8ht"] Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.262600 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-969f6549d-4t8ht" 
podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api-log" containerID="cri-o://020ab7ec7b0bb107fd755ee78a7ced6530ee7d2d1d483d142b30a61b46295d2b" gracePeriod=30 Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.262962 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-969f6549d-4t8ht" podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api" containerID="cri-o://fdd9563c0cb5345cf18ebaf800ec8ff6b1c1077865c86bf1579f8fb3aa155adf" gracePeriod=30 Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.456626 4931 generic.go:334] "Generic (PLEG): container finished" podID="0442bbe2-b143-4004-a1d8-e207e159423e" containerID="020ab7ec7b0bb107fd755ee78a7ced6530ee7d2d1d483d142b30a61b46295d2b" exitCode=143 Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.457512 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-969f6549d-4t8ht" event={"ID":"0442bbe2-b143-4004-a1d8-e207e159423e","Type":"ContainerDied","Data":"020ab7ec7b0bb107fd755ee78a7ced6530ee7d2d1d483d142b30a61b46295d2b"} Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.497451 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.498704 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.502450 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6nrnd" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.502562 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.502665 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.504046 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.516187 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.516284 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-openstack-config\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.516431 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-openstack-config-secret\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.516494 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2l4s\" (UniqueName: \"kubernetes.io/projected/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-kube-api-access-j2l4s\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.618598 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2l4s\" (UniqueName: \"kubernetes.io/projected/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-kube-api-access-j2l4s\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.618682 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.618757 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-openstack-config\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.618799 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-openstack-config-secret\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.620649 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-openstack-config\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.630131 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-openstack-config-secret\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.630286 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.644998 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2l4s\" (UniqueName: \"kubernetes.io/projected/0ca8ee92-e191-4e9b-aa91-af27342a9fb5-kube-api-access-j2l4s\") pod \"openstackclient\" (UID: \"0ca8ee92-e191-4e9b-aa91-af27342a9fb5\") " pod="openstack/openstackclient" Dec 01 15:21:20 crc kubenswrapper[4931]: I1201 15:21:20.825684 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 15:21:21 crc kubenswrapper[4931]: I1201 15:21:21.300501 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 15:21:21 crc kubenswrapper[4931]: I1201 15:21:21.470230 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0ca8ee92-e191-4e9b-aa91-af27342a9fb5","Type":"ContainerStarted","Data":"99cebc1dbbdc2027476542b1004b61b04c775d342c395a315b22b8ad5c0e019f"} Dec 01 15:21:22 crc kubenswrapper[4931]: I1201 15:21:22.333426 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:21:22 crc kubenswrapper[4931]: I1201 15:21:22.418982 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6c9d84d99b-fj4vs" Dec 01 15:21:22 crc kubenswrapper[4931]: I1201 15:21:22.991723 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 15:21:23 crc kubenswrapper[4931]: I1201 15:21:23.449080 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-969f6549d-4t8ht" podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:47562->10.217.0.157:9311: read: connection reset by peer" Dec 01 15:21:23 crc kubenswrapper[4931]: I1201 15:21:23.449125 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-969f6549d-4t8ht" podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:47564->10.217.0.157:9311: read: connection reset by peer" Dec 01 15:21:24 crc kubenswrapper[4931]: I1201 15:21:24.513747 4931 generic.go:334] "Generic (PLEG): container finished" podID="0442bbe2-b143-4004-a1d8-e207e159423e" 
containerID="fdd9563c0cb5345cf18ebaf800ec8ff6b1c1077865c86bf1579f8fb3aa155adf" exitCode=0 Dec 01 15:21:24 crc kubenswrapper[4931]: I1201 15:21:24.513793 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-969f6549d-4t8ht" event={"ID":"0442bbe2-b143-4004-a1d8-e207e159423e","Type":"ContainerDied","Data":"fdd9563c0cb5345cf18ebaf800ec8ff6b1c1077865c86bf1579f8fb3aa155adf"} Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.099838 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.121375 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data-custom\") pod \"0442bbe2-b143-4004-a1d8-e207e159423e\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.121452 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data\") pod \"0442bbe2-b143-4004-a1d8-e207e159423e\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.121616 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0442bbe2-b143-4004-a1d8-e207e159423e-logs\") pod \"0442bbe2-b143-4004-a1d8-e207e159423e\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.122045 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m49wm\" (UniqueName: \"kubernetes.io/projected/0442bbe2-b143-4004-a1d8-e207e159423e-kube-api-access-m49wm\") pod \"0442bbe2-b143-4004-a1d8-e207e159423e\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " 
Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.122433 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-combined-ca-bundle\") pod \"0442bbe2-b143-4004-a1d8-e207e159423e\" (UID: \"0442bbe2-b143-4004-a1d8-e207e159423e\") " Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.127637 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0442bbe2-b143-4004-a1d8-e207e159423e-logs" (OuterVolumeSpecName: "logs") pod "0442bbe2-b143-4004-a1d8-e207e159423e" (UID: "0442bbe2-b143-4004-a1d8-e207e159423e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.135937 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0442bbe2-b143-4004-a1d8-e207e159423e-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.139531 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0442bbe2-b143-4004-a1d8-e207e159423e" (UID: "0442bbe2-b143-4004-a1d8-e207e159423e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.185658 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0442bbe2-b143-4004-a1d8-e207e159423e-kube-api-access-m49wm" (OuterVolumeSpecName: "kube-api-access-m49wm") pod "0442bbe2-b143-4004-a1d8-e207e159423e" (UID: "0442bbe2-b143-4004-a1d8-e207e159423e"). InnerVolumeSpecName "kube-api-access-m49wm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.213891 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0442bbe2-b143-4004-a1d8-e207e159423e" (UID: "0442bbe2-b143-4004-a1d8-e207e159423e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.237459 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.237752 4931 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.237858 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m49wm\" (UniqueName: \"kubernetes.io/projected/0442bbe2-b143-4004-a1d8-e207e159423e-kube-api-access-m49wm\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.240591 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data" (OuterVolumeSpecName: "config-data") pod "0442bbe2-b143-4004-a1d8-e207e159423e" (UID: "0442bbe2-b143-4004-a1d8-e207e159423e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.339580 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0442bbe2-b143-4004-a1d8-e207e159423e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.527287 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-969f6549d-4t8ht" event={"ID":"0442bbe2-b143-4004-a1d8-e207e159423e","Type":"ContainerDied","Data":"c9118773c361ae090075db3b713b04fd11050c41582b4703a3713fc71dc63e1e"} Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.527653 4931 scope.go:117] "RemoveContainer" containerID="fdd9563c0cb5345cf18ebaf800ec8ff6b1c1077865c86bf1579f8fb3aa155adf" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.527489 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-969f6549d-4t8ht" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.576586 4931 scope.go:117] "RemoveContainer" containerID="020ab7ec7b0bb107fd755ee78a7ced6530ee7d2d1d483d142b30a61b46295d2b" Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.581729 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-969f6549d-4t8ht"] Dec 01 15:21:25 crc kubenswrapper[4931]: I1201 15:21:25.590982 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-969f6549d-4t8ht"] Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.011940 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7465544595-sc668"] Dec 01 15:21:26 crc kubenswrapper[4931]: E1201 15:21:26.012367 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api-log" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.012403 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api-log" Dec 01 15:21:26 crc kubenswrapper[4931]: E1201 15:21:26.012464 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.012474 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.012685 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api-log" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.012725 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0442bbe2-b143-4004-a1d8-e207e159423e" containerName="barbican-api" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.014225 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.016624 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.016807 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.016966 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.032497 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7465544595-sc668"] Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.081864 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1d2c6e-39e6-438c-98e8-be76bfa71050-run-httpd\") pod 
\"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.081929 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5vh4\" (UniqueName: \"kubernetes.io/projected/2b1d2c6e-39e6-438c-98e8-be76bfa71050-kube-api-access-j5vh4\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.082075 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-internal-tls-certs\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.082158 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1d2c6e-39e6-438c-98e8-be76bfa71050-log-httpd\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.082331 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-config-data\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.082514 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/2b1d2c6e-39e6-438c-98e8-be76bfa71050-etc-swift\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.082704 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-combined-ca-bundle\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.082758 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-public-tls-certs\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.184620 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5vh4\" (UniqueName: \"kubernetes.io/projected/2b1d2c6e-39e6-438c-98e8-be76bfa71050-kube-api-access-j5vh4\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.184683 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-internal-tls-certs\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.184715 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2b1d2c6e-39e6-438c-98e8-be76bfa71050-log-httpd\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.184747 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-config-data\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.184782 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b1d2c6e-39e6-438c-98e8-be76bfa71050-etc-swift\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.184841 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-combined-ca-bundle\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.184860 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-public-tls-certs\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.184887 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1d2c6e-39e6-438c-98e8-be76bfa71050-run-httpd\") 
pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.185596 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1d2c6e-39e6-438c-98e8-be76bfa71050-run-httpd\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.189820 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1d2c6e-39e6-438c-98e8-be76bfa71050-log-httpd\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.196441 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-public-tls-certs\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.196558 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-internal-tls-certs\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.197153 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b1d2c6e-39e6-438c-98e8-be76bfa71050-etc-swift\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " 
pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.197724 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-combined-ca-bundle\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.198588 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1d2c6e-39e6-438c-98e8-be76bfa71050-config-data\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.203827 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5vh4\" (UniqueName: \"kubernetes.io/projected/2b1d2c6e-39e6-438c-98e8-be76bfa71050-kube-api-access-j5vh4\") pod \"swift-proxy-7465544595-sc668\" (UID: \"2b1d2c6e-39e6-438c-98e8-be76bfa71050\") " pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.253335 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0442bbe2-b143-4004-a1d8-e207e159423e" path="/var/lib/kubelet/pods/0442bbe2-b143-4004-a1d8-e207e159423e/volumes" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.418426 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:26 crc kubenswrapper[4931]: I1201 15:21:26.600703 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6479b7c68-txrvx" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 01 15:21:27 crc kubenswrapper[4931]: I1201 15:21:27.703150 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:21:28 crc kubenswrapper[4931]: I1201 15:21:28.105126 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:28 crc kubenswrapper[4931]: I1201 15:21:28.105409 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="ceilometer-central-agent" containerID="cri-o://9f849267533ea27449b15fbc7a1db9e8b8c2c36ed33166cb3a4ad826115f0e97" gracePeriod=30 Dec 01 15:21:28 crc kubenswrapper[4931]: I1201 15:21:28.105617 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="proxy-httpd" containerID="cri-o://d7bdef9f5f4ff27e3daa7252a373c1ac4bbaa741ae163a7c18fb9e488e289c6e" gracePeriod=30 Dec 01 15:21:28 crc kubenswrapper[4931]: I1201 15:21:28.105688 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="sg-core" containerID="cri-o://340370a8052e60d9faef18c959ef8c8ea5620847409fc6f64ab48d97e78238bc" gracePeriod=30 Dec 01 15:21:28 crc kubenswrapper[4931]: I1201 15:21:28.105696 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="ceilometer-notification-agent" containerID="cri-o://fc01f84aba9de69b5c38161a1e55ec6d3cb25f9d688143aba884db04a34e939b" gracePeriod=30 Dec 01 15:21:28 crc kubenswrapper[4931]: I1201 15:21:28.118434 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.161:3000/\": EOF" Dec 01 15:21:28 crc kubenswrapper[4931]: I1201 15:21:28.556586 4931 generic.go:334] "Generic (PLEG): container finished" podID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerID="340370a8052e60d9faef18c959ef8c8ea5620847409fc6f64ab48d97e78238bc" exitCode=2 Dec 01 15:21:28 crc kubenswrapper[4931]: I1201 15:21:28.556938 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerDied","Data":"340370a8052e60d9faef18c959ef8c8ea5620847409fc6f64ab48d97e78238bc"} Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.162670 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.162946 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-log" containerID="cri-o://689bccfbbada21a31d285e8774317a671d2202169bb277c5a0debcf496e821f7" gracePeriod=30 Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.163213 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-httpd" containerID="cri-o://50178a8e177ef0ae4ce84e40919d15aee6c2f140b9641179cff854b7979db0d5" gracePeriod=30 Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.371401 4931 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.161:3000/\": dial tcp 10.217.0.161:3000: connect: connection refused" Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.576780 4931 generic.go:334] "Generic (PLEG): container finished" podID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerID="d7bdef9f5f4ff27e3daa7252a373c1ac4bbaa741ae163a7c18fb9e488e289c6e" exitCode=0 Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.576816 4931 generic.go:334] "Generic (PLEG): container finished" podID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerID="fc01f84aba9de69b5c38161a1e55ec6d3cb25f9d688143aba884db04a34e939b" exitCode=0 Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.576828 4931 generic.go:334] "Generic (PLEG): container finished" podID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerID="9f849267533ea27449b15fbc7a1db9e8b8c2c36ed33166cb3a4ad826115f0e97" exitCode=0 Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.576881 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerDied","Data":"d7bdef9f5f4ff27e3daa7252a373c1ac4bbaa741ae163a7c18fb9e488e289c6e"} Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.576911 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerDied","Data":"fc01f84aba9de69b5c38161a1e55ec6d3cb25f9d688143aba884db04a34e939b"} Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.576923 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerDied","Data":"9f849267533ea27449b15fbc7a1db9e8b8c2c36ed33166cb3a4ad826115f0e97"} Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.579729 
4931 generic.go:334] "Generic (PLEG): container finished" podID="3a5867d7-6574-4f95-97c9-f6830600606a" containerID="689bccfbbada21a31d285e8774317a671d2202169bb277c5a0debcf496e821f7" exitCode=143 Dec 01 15:21:29 crc kubenswrapper[4931]: I1201 15:21:29.579778 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a5867d7-6574-4f95-97c9-f6830600606a","Type":"ContainerDied","Data":"689bccfbbada21a31d285e8774317a671d2202169bb277c5a0debcf496e821f7"} Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.270721 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-pt2d4"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.274061 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.288805 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pt2d4"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.376313 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-8bv8v"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.376735 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-operator-scripts\") pod \"nova-api-db-create-pt2d4\" (UID: \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\") " pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.376897 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb9b2\" (UniqueName: \"kubernetes.io/projected/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-kube-api-access-cb9b2\") pod \"nova-api-db-create-pt2d4\" (UID: \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\") " pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:30 
crc kubenswrapper[4931]: I1201 15:21:30.377677 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.386063 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8bv8v"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.411458 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6966-account-create-update-9b6ww"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.412864 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.414834 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.432431 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6966-account-create-update-9b6ww"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.479822 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-operator-scripts\") pod \"nova-api-db-create-pt2d4\" (UID: \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\") " pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.479898 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j76xv\" (UniqueName: \"kubernetes.io/projected/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-kube-api-access-j76xv\") pod \"nova-cell0-db-create-8bv8v\" (UID: \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\") " pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.479919 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-operator-scripts\") pod \"nova-cell0-db-create-8bv8v\" (UID: \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\") " pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.479949 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb9b2\" (UniqueName: \"kubernetes.io/projected/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-kube-api-access-cb9b2\") pod \"nova-api-db-create-pt2d4\" (UID: \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\") " pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.479984 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680b6f89-d743-4467-b9da-68a831e24fa9-operator-scripts\") pod \"nova-api-6966-account-create-update-9b6ww\" (UID: \"680b6f89-d743-4467-b9da-68a831e24fa9\") " pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.480068 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bghkq\" (UniqueName: \"kubernetes.io/projected/680b6f89-d743-4467-b9da-68a831e24fa9-kube-api-access-bghkq\") pod \"nova-api-6966-account-create-update-9b6ww\" (UID: \"680b6f89-d743-4467-b9da-68a831e24fa9\") " pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.480865 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-operator-scripts\") pod \"nova-api-db-create-pt2d4\" (UID: \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\") " pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.509168 4931 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4s7mh"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.510574 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.517104 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb9b2\" (UniqueName: \"kubernetes.io/projected/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-kube-api-access-cb9b2\") pod \"nova-api-db-create-pt2d4\" (UID: \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\") " pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.525186 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4s7mh"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.587371 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f40fe4-164c-4aad-9644-d509c905673f-operator-scripts\") pod \"nova-cell1-db-create-4s7mh\" (UID: \"55f40fe4-164c-4aad-9644-d509c905673f\") " pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.587690 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680b6f89-d743-4467-b9da-68a831e24fa9-operator-scripts\") pod \"nova-api-6966-account-create-update-9b6ww\" (UID: \"680b6f89-d743-4467-b9da-68a831e24fa9\") " pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.587923 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bghkq\" (UniqueName: \"kubernetes.io/projected/680b6f89-d743-4467-b9da-68a831e24fa9-kube-api-access-bghkq\") pod \"nova-api-6966-account-create-update-9b6ww\" (UID: 
\"680b6f89-d743-4467-b9da-68a831e24fa9\") " pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.588099 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j76xv\" (UniqueName: \"kubernetes.io/projected/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-kube-api-access-j76xv\") pod \"nova-cell0-db-create-8bv8v\" (UID: \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\") " pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.597812 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-operator-scripts\") pod \"nova-cell0-db-create-8bv8v\" (UID: \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\") " pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.597964 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqjcb\" (UniqueName: \"kubernetes.io/projected/55f40fe4-164c-4aad-9644-d509c905673f-kube-api-access-mqjcb\") pod \"nova-cell1-db-create-4s7mh\" (UID: \"55f40fe4-164c-4aad-9644-d509c905673f\") " pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.591832 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680b6f89-d743-4467-b9da-68a831e24fa9-operator-scripts\") pod \"nova-api-6966-account-create-update-9b6ww\" (UID: \"680b6f89-d743-4467-b9da-68a831e24fa9\") " pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.592336 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.599302 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-operator-scripts\") pod \"nova-cell0-db-create-8bv8v\" (UID: \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\") " pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.630984 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j76xv\" (UniqueName: \"kubernetes.io/projected/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-kube-api-access-j76xv\") pod \"nova-cell0-db-create-8bv8v\" (UID: \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\") " pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.632045 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bghkq\" (UniqueName: \"kubernetes.io/projected/680b6f89-d743-4467-b9da-68a831e24fa9-kube-api-access-bghkq\") pod \"nova-api-6966-account-create-update-9b6ww\" (UID: \"680b6f89-d743-4467-b9da-68a831e24fa9\") " pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.634018 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c460-account-create-update-bp94q"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.635316 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.645995 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.699564 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f40fe4-164c-4aad-9644-d509c905673f-operator-scripts\") pod \"nova-cell1-db-create-4s7mh\" (UID: \"55f40fe4-164c-4aad-9644-d509c905673f\") " pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.699930 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqjcb\" (UniqueName: \"kubernetes.io/projected/55f40fe4-164c-4aad-9644-d509c905673f-kube-api-access-mqjcb\") pod \"nova-cell1-db-create-4s7mh\" (UID: \"55f40fe4-164c-4aad-9644-d509c905673f\") " pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.700550 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.701144 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f40fe4-164c-4aad-9644-d509c905673f-operator-scripts\") pod \"nova-cell1-db-create-4s7mh\" (UID: \"55f40fe4-164c-4aad-9644-d509c905673f\") " pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.713247 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c460-account-create-update-bp94q"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.724406 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqjcb\" (UniqueName: \"kubernetes.io/projected/55f40fe4-164c-4aad-9644-d509c905673f-kube-api-access-mqjcb\") pod \"nova-cell1-db-create-4s7mh\" (UID: \"55f40fe4-164c-4aad-9644-d509c905673f\") " pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.737019 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.784359 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f730-account-create-update-g9cxx"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.785502 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.788189 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.796650 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f730-account-create-update-g9cxx"] Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.804270 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51cb7657-b8e4-43c0-b97a-633ae04b743d-operator-scripts\") pod \"nova-cell0-c460-account-create-update-bp94q\" (UID: \"51cb7657-b8e4-43c0-b97a-633ae04b743d\") " pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.804435 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2jwz\" (UniqueName: \"kubernetes.io/projected/51cb7657-b8e4-43c0-b97a-633ae04b743d-kube-api-access-r2jwz\") pod \"nova-cell0-c460-account-create-update-bp94q\" (UID: \"51cb7657-b8e4-43c0-b97a-633ae04b743d\") " pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.892053 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.906254 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51cb7657-b8e4-43c0-b97a-633ae04b743d-operator-scripts\") pod \"nova-cell0-c460-account-create-update-bp94q\" (UID: \"51cb7657-b8e4-43c0-b97a-633ae04b743d\") " pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.906371 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d8e3fe-15ab-4d9d-939d-4198c8571597-operator-scripts\") pod \"nova-cell1-f730-account-create-update-g9cxx\" (UID: \"f8d8e3fe-15ab-4d9d-939d-4198c8571597\") " pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.906410 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2jwz\" (UniqueName: \"kubernetes.io/projected/51cb7657-b8e4-43c0-b97a-633ae04b743d-kube-api-access-r2jwz\") pod \"nova-cell0-c460-account-create-update-bp94q\" (UID: \"51cb7657-b8e4-43c0-b97a-633ae04b743d\") " pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.906458 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjrfm\" (UniqueName: \"kubernetes.io/projected/f8d8e3fe-15ab-4d9d-939d-4198c8571597-kube-api-access-bjrfm\") pod \"nova-cell1-f730-account-create-update-g9cxx\" (UID: \"f8d8e3fe-15ab-4d9d-939d-4198c8571597\") " pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.907070 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/51cb7657-b8e4-43c0-b97a-633ae04b743d-operator-scripts\") pod \"nova-cell0-c460-account-create-update-bp94q\" (UID: \"51cb7657-b8e4-43c0-b97a-633ae04b743d\") " pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:30 crc kubenswrapper[4931]: I1201 15:21:30.924852 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2jwz\" (UniqueName: \"kubernetes.io/projected/51cb7657-b8e4-43c0-b97a-633ae04b743d-kube-api-access-r2jwz\") pod \"nova-cell0-c460-account-create-update-bp94q\" (UID: \"51cb7657-b8e4-43c0-b97a-633ae04b743d\") " pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.008539 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d8e3fe-15ab-4d9d-939d-4198c8571597-operator-scripts\") pod \"nova-cell1-f730-account-create-update-g9cxx\" (UID: \"f8d8e3fe-15ab-4d9d-939d-4198c8571597\") " pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.008655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjrfm\" (UniqueName: \"kubernetes.io/projected/f8d8e3fe-15ab-4d9d-939d-4198c8571597-kube-api-access-bjrfm\") pod \"nova-cell1-f730-account-create-update-g9cxx\" (UID: \"f8d8e3fe-15ab-4d9d-939d-4198c8571597\") " pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.009368 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d8e3fe-15ab-4d9d-939d-4198c8571597-operator-scripts\") pod \"nova-cell1-f730-account-create-update-g9cxx\" (UID: \"f8d8e3fe-15ab-4d9d-939d-4198c8571597\") " pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 
15:21:31.025002 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjrfm\" (UniqueName: \"kubernetes.io/projected/f8d8e3fe-15ab-4d9d-939d-4198c8571597-kube-api-access-bjrfm\") pod \"nova-cell1-f730-account-create-update-g9cxx\" (UID: \"f8d8e3fe-15ab-4d9d-939d-4198c8571597\") " pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.075905 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.103874 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.189363 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.189623 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerName="glance-log" containerID="cri-o://2ca6ff941a63c9f5cb9c155cddf7cbc90e315a52a2ce845b9fbff5144f8e2f48" gracePeriod=30 Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.190291 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerName="glance-httpd" containerID="cri-o://b27349e697d30399f8075c2cb003d90367ae0464a5d4d5c19e50e2438314d4c5" gracePeriod=30 Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.640474 4931 generic.go:334] "Generic (PLEG): container finished" podID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerID="2ca6ff941a63c9f5cb9c155cddf7cbc90e315a52a2ce845b9fbff5144f8e2f48" exitCode=143 Dec 01 15:21:31 crc kubenswrapper[4931]: I1201 15:21:31.640600 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3d6c071d-fd2a-43e2-a7d2-c2499809aad0","Type":"ContainerDied","Data":"2ca6ff941a63c9f5cb9c155cddf7cbc90e315a52a2ce845b9fbff5144f8e2f48"} Dec 01 15:21:32 crc kubenswrapper[4931]: I1201 15:21:32.356221 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.147:9292/healthcheck\": read tcp 10.217.0.2:54774->10.217.0.147:9292: read: connection reset by peer" Dec 01 15:21:32 crc kubenswrapper[4931]: I1201 15:21:32.356221 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.147:9292/healthcheck\": read tcp 10.217.0.2:54790->10.217.0.147:9292: read: connection reset by peer" Dec 01 15:21:32 crc kubenswrapper[4931]: I1201 15:21:32.664795 4931 generic.go:334] "Generic (PLEG): container finished" podID="3a5867d7-6574-4f95-97c9-f6830600606a" containerID="50178a8e177ef0ae4ce84e40919d15aee6c2f140b9641179cff854b7979db0d5" exitCode=0 Dec 01 15:21:32 crc kubenswrapper[4931]: I1201 15:21:32.664838 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a5867d7-6574-4f95-97c9-f6830600606a","Type":"ContainerDied","Data":"50178a8e177ef0ae4ce84e40919d15aee6c2f140b9641179cff854b7979db0d5"} Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.336537 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.460986 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-config-data\") pod \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.461049 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz99t\" (UniqueName: \"kubernetes.io/projected/cf71c332-a4fd-4709-9465-db8a45e1e5a7-kube-api-access-jz99t\") pod \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.461080 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-log-httpd\") pod \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.461192 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-run-httpd\") pod \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.461325 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-combined-ca-bundle\") pod \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.461373 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-sg-core-conf-yaml\") pod \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.461433 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-scripts\") pod \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\" (UID: \"cf71c332-a4fd-4709-9465-db8a45e1e5a7\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.463620 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cf71c332-a4fd-4709-9465-db8a45e1e5a7" (UID: "cf71c332-a4fd-4709-9465-db8a45e1e5a7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.465616 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cf71c332-a4fd-4709-9465-db8a45e1e5a7" (UID: "cf71c332-a4fd-4709-9465-db8a45e1e5a7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.475142 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-scripts" (OuterVolumeSpecName: "scripts") pod "cf71c332-a4fd-4709-9465-db8a45e1e5a7" (UID: "cf71c332-a4fd-4709-9465-db8a45e1e5a7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.475259 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf71c332-a4fd-4709-9465-db8a45e1e5a7-kube-api-access-jz99t" (OuterVolumeSpecName: "kube-api-access-jz99t") pod "cf71c332-a4fd-4709-9465-db8a45e1e5a7" (UID: "cf71c332-a4fd-4709-9465-db8a45e1e5a7"). InnerVolumeSpecName "kube-api-access-jz99t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.560765 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cf71c332-a4fd-4709-9465-db8a45e1e5a7" (UID: "cf71c332-a4fd-4709-9465-db8a45e1e5a7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.564060 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.564086 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.564095 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz99t\" (UniqueName: \"kubernetes.io/projected/cf71c332-a4fd-4709-9465-db8a45e1e5a7-kube-api-access-jz99t\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.564104 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-log-httpd\") on node 
\"crc\" DevicePath \"\"" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.564112 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf71c332-a4fd-4709-9465-db8a45e1e5a7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.658886 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf71c332-a4fd-4709-9465-db8a45e1e5a7" (UID: "cf71c332-a4fd-4709-9465-db8a45e1e5a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.672528 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.711593 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf71c332-a4fd-4709-9465-db8a45e1e5a7","Type":"ContainerDied","Data":"33abc3552173454383028cea5dc9f7d95e5b01a3adce90e411a0f399ff3dc026"} Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.711646 4931 scope.go:117] "RemoveContainer" containerID="d7bdef9f5f4ff27e3daa7252a373c1ac4bbaa741ae163a7c18fb9e488e289c6e" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.711788 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.727204 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0ca8ee92-e191-4e9b-aa91-af27342a9fb5","Type":"ContainerStarted","Data":"bd1025f79c73931c87a4ff8a6658c75f1e285c094f2fe44721ac5a829cf23f39"} Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.768170 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-config-data" (OuterVolumeSpecName: "config-data") pod "cf71c332-a4fd-4709-9465-db8a45e1e5a7" (UID: "cf71c332-a4fd-4709-9465-db8a45e1e5a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.778744 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf71c332-a4fd-4709-9465-db8a45e1e5a7-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.883071 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.917748 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.2937971790000002 podStartE2EDuration="13.917728082s" podCreationTimestamp="2025-12-01 15:21:20 +0000 UTC" firstStartedPulling="2025-12-01 15:21:21.305878917 +0000 UTC m=+1227.731752584" lastFinishedPulling="2025-12-01 15:21:32.92980982 +0000 UTC m=+1239.355683487" observedRunningTime="2025-12-01 15:21:33.773182389 +0000 UTC m=+1240.199056076" watchObservedRunningTime="2025-12-01 15:21:33.917728082 +0000 UTC m=+1240.343601749" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.941134 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f86b4c4b5-zs55r" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.982718 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-logs\") pod \"3a5867d7-6574-4f95-97c9-f6830600606a\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.982778 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-combined-ca-bundle\") pod \"3a5867d7-6574-4f95-97c9-f6830600606a\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.982881 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-config-data\") pod \"3a5867d7-6574-4f95-97c9-f6830600606a\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.982914 4931 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-httpd-run\") pod \"3a5867d7-6574-4f95-97c9-f6830600606a\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.982946 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-internal-tls-certs\") pod \"3a5867d7-6574-4f95-97c9-f6830600606a\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.983050 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"3a5867d7-6574-4f95-97c9-f6830600606a\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.983076 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-scripts\") pod \"3a5867d7-6574-4f95-97c9-f6830600606a\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.983118 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rwl\" (UniqueName: \"kubernetes.io/projected/3a5867d7-6574-4f95-97c9-f6830600606a-kube-api-access-94rwl\") pod \"3a5867d7-6574-4f95-97c9-f6830600606a\" (UID: \"3a5867d7-6574-4f95-97c9-f6830600606a\") " Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.988235 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3a5867d7-6574-4f95-97c9-f6830600606a" (UID: "3a5867d7-6574-4f95-97c9-f6830600606a"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:33 crc kubenswrapper[4931]: I1201 15:21:33.988452 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-logs" (OuterVolumeSpecName: "logs") pod "3a5867d7-6574-4f95-97c9-f6830600606a" (UID: "3a5867d7-6574-4f95-97c9-f6830600606a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.011241 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c460-account-create-update-bp94q"] Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.034536 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6cf859c4fb-g2pzn"] Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.034797 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6cf859c4fb-g2pzn" podUID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerName="neutron-api" containerID="cri-o://4d33ae8b0efc7fff609af493e5981b6538b653290a86dc610c8a12ad0a6c98c9" gracePeriod=30 Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.034938 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6cf859c4fb-g2pzn" podUID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerName="neutron-httpd" containerID="cri-o://c624b834b4c99846acbfd0f012a6bcd0efadaca9bd00aea534cdf91fea2fd53a" gracePeriod=30 Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.055489 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.069163 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079089 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 
15:21:34 crc kubenswrapper[4931]: E1201 15:21:34.079455 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="ceilometer-notification-agent" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079473 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="ceilometer-notification-agent" Dec 01 15:21:34 crc kubenswrapper[4931]: E1201 15:21:34.079495 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="sg-core" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079504 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="sg-core" Dec 01 15:21:34 crc kubenswrapper[4931]: E1201 15:21:34.079517 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="ceilometer-central-agent" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079523 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="ceilometer-central-agent" Dec 01 15:21:34 crc kubenswrapper[4931]: E1201 15:21:34.079539 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-httpd" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079545 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-httpd" Dec 01 15:21:34 crc kubenswrapper[4931]: E1201 15:21:34.079560 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-log" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079567 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-log" Dec 01 
15:21:34 crc kubenswrapper[4931]: E1201 15:21:34.079582 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="proxy-httpd" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079587 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="proxy-httpd" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079754 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="sg-core" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079766 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="ceilometer-notification-agent" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079783 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="proxy-httpd" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079794 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" containerName="ceilometer-central-agent" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079808 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-httpd" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.079823 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" containerName="glance-log" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.081349 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.086141 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.086155 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a5867d7-6574-4f95-97c9-f6830600606a-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.088274 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.088437 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.097625 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.163910 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "3a5867d7-6574-4f95-97c9-f6830600606a" (UID: "3a5867d7-6574-4f95-97c9-f6830600606a"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.163991 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5867d7-6574-4f95-97c9-f6830600606a-kube-api-access-94rwl" (OuterVolumeSpecName: "kube-api-access-94rwl") pod "3a5867d7-6574-4f95-97c9-f6830600606a" (UID: "3a5867d7-6574-4f95-97c9-f6830600606a"). InnerVolumeSpecName "kube-api-access-94rwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.164075 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-scripts" (OuterVolumeSpecName: "scripts") pod "3a5867d7-6574-4f95-97c9-f6830600606a" (UID: "3a5867d7-6574-4f95-97c9-f6830600606a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.171612 4931 scope.go:117] "RemoveContainer" containerID="340370a8052e60d9faef18c959ef8c8ea5620847409fc6f64ab48d97e78238bc" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.179567 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a5867d7-6574-4f95-97c9-f6830600606a" (UID: "3a5867d7-6574-4f95-97c9-f6830600606a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205113 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205174 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205313 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-log-httpd\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205339 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-config-data\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205371 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-scripts\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205459 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj75k\" (UniqueName: \"kubernetes.io/projected/bae099e9-0e3a-440d-a3fb-c993aacb5014-kube-api-access-bj75k\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205503 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-run-httpd\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205562 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205575 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205584 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94rwl\" (UniqueName: \"kubernetes.io/projected/3a5867d7-6574-4f95-97c9-f6830600606a-kube-api-access-94rwl\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.205594 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.209915 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-config-data" (OuterVolumeSpecName: "config-data") pod 
"3a5867d7-6574-4f95-97c9-f6830600606a" (UID: "3a5867d7-6574-4f95-97c9-f6830600606a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.216817 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3a5867d7-6574-4f95-97c9-f6830600606a" (UID: "3a5867d7-6574-4f95-97c9-f6830600606a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.223977 4931 scope.go:117] "RemoveContainer" containerID="fc01f84aba9de69b5c38161a1e55ec6d3cb25f9d688143aba884db04a34e939b" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.257076 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf71c332-a4fd-4709-9465-db8a45e1e5a7" path="/var/lib/kubelet/pods/cf71c332-a4fd-4709-9465-db8a45e1e5a7/volumes" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.279514 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307084 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-scripts\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307168 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj75k\" (UniqueName: \"kubernetes.io/projected/bae099e9-0e3a-440d-a3fb-c993aacb5014-kube-api-access-bj75k\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 
15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307201 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-run-httpd\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307230 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307255 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307324 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-log-httpd\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307346 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-config-data\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307478 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath 
\"\"" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307496 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.307510 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5867d7-6574-4f95-97c9-f6830600606a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.308352 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-run-httpd\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.311676 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.311938 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-log-httpd\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.318573 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.319831 4931 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-config-data\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.332318 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-scripts\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.350242 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj75k\" (UniqueName: \"kubernetes.io/projected/bae099e9-0e3a-440d-a3fb-c993aacb5014-kube-api-access-bj75k\") pod \"ceilometer-0\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.352144 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pt2d4"] Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.360316 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4s7mh"] Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.368509 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8bv8v"] Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.386619 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6966-account-create-update-9b6ww"] Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.386749 4931 scope.go:117] "RemoveContainer" containerID="9f849267533ea27449b15fbc7a1db9e8b8c2c36ed33166cb3a4ad826115f0e97" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.395815 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f730-account-create-update-g9cxx"] Dec 01 
15:21:34 crc kubenswrapper[4931]: W1201 15:21:34.415606 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37513865_ccd3_4ad7_89f0_66e1f3f6b9a4.slice/crio-38987410686d3645ab4e64b3854967ac7d5aa20c3aa63443a01ce22bface722c WatchSource:0}: Error finding container 38987410686d3645ab4e64b3854967ac7d5aa20c3aa63443a01ce22bface722c: Status 404 returned error can't find the container with id 38987410686d3645ab4e64b3854967ac7d5aa20c3aa63443a01ce22bface722c Dec 01 15:21:34 crc kubenswrapper[4931]: W1201 15:21:34.418002 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b99e4c2_d0b8_4206_a0cf_45f43d0557cf.slice/crio-64d997684ca60639714bc4656b81e26748bae91b601b394ceb549d7a9b1e36b8 WatchSource:0}: Error finding container 64d997684ca60639714bc4656b81e26748bae91b601b394ceb549d7a9b1e36b8: Status 404 returned error can't find the container with id 64d997684ca60639714bc4656b81e26748bae91b601b394ceb549d7a9b1e36b8 Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.453168 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7465544595-sc668"] Dec 01 15:21:34 crc kubenswrapper[4931]: W1201 15:21:34.458316 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8d8e3fe_15ab_4d9d_939d_4198c8571597.slice/crio-dbc12bf57bae114991b0c44a2ed66de75357e30f5e82b4e081ea2e4ceffd6de8 WatchSource:0}: Error finding container dbc12bf57bae114991b0c44a2ed66de75357e30f5e82b4e081ea2e4ceffd6de8: Status 404 returned error can't find the container with id dbc12bf57bae114991b0c44a2ed66de75357e30f5e82b4e081ea2e4ceffd6de8 Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.490822 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.750051 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f730-account-create-update-g9cxx" event={"ID":"f8d8e3fe-15ab-4d9d-939d-4198c8571597","Type":"ContainerStarted","Data":"dbc12bf57bae114991b0c44a2ed66de75357e30f5e82b4e081ea2e4ceffd6de8"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.811136 4931 generic.go:334] "Generic (PLEG): container finished" podID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerID="b27349e697d30399f8075c2cb003d90367ae0464a5d4d5c19e50e2438314d4c5" exitCode=0 Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.811479 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3d6c071d-fd2a-43e2-a7d2-c2499809aad0","Type":"ContainerDied","Data":"b27349e697d30399f8075c2cb003d90367ae0464a5d4d5c19e50e2438314d4c5"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.822542 4931 generic.go:334] "Generic (PLEG): container finished" podID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerID="c624b834b4c99846acbfd0f012a6bcd0efadaca9bd00aea534cdf91fea2fd53a" exitCode=0 Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.822589 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cf859c4fb-g2pzn" event={"ID":"648f1c18-1467-4f3c-9ec3-8e1289c57a4f","Type":"ContainerDied","Data":"c624b834b4c99846acbfd0f012a6bcd0efadaca9bd00aea534cdf91fea2fd53a"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.824535 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pt2d4" event={"ID":"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4","Type":"ContainerStarted","Data":"38987410686d3645ab4e64b3854967ac7d5aa20c3aa63443a01ce22bface722c"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.847452 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8bv8v" 
event={"ID":"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf","Type":"ContainerStarted","Data":"64d997684ca60639714bc4656b81e26748bae91b601b394ceb549d7a9b1e36b8"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.852345 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c460-account-create-update-bp94q" event={"ID":"51cb7657-b8e4-43c0-b97a-633ae04b743d","Type":"ContainerStarted","Data":"78d7e0fc026beed159aacf1e748a609d7a96ba0569f8978e00d276915fac7a97"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.852419 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c460-account-create-update-bp94q" event={"ID":"51cb7657-b8e4-43c0-b97a-633ae04b743d","Type":"ContainerStarted","Data":"39f1a21e41df2b4a89c6ee6c54e17add3c638337f77e0a892f726a72c9651277"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.855997 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6966-account-create-update-9b6ww" event={"ID":"680b6f89-d743-4467-b9da-68a831e24fa9","Type":"ContainerStarted","Data":"1c9643bd4d5cf7ea0ef6cff85698217372bc7d33fc809c6505829bd112b2839f"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.864528 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7465544595-sc668" event={"ID":"2b1d2c6e-39e6-438c-98e8-be76bfa71050","Type":"ContainerStarted","Data":"3cc127901b2cda1cba6d323533e1d1e1dbb31307f37f4e1086463655b541e17e"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.874548 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-c460-account-create-update-bp94q" podStartSLOduration=4.874523782 podStartE2EDuration="4.874523782s" podCreationTimestamp="2025-12-01 15:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:34.869540322 +0000 UTC m=+1241.295413989" watchObservedRunningTime="2025-12-01 
15:21:34.874523782 +0000 UTC m=+1241.300397449" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.912458 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4s7mh" event={"ID":"55f40fe4-164c-4aad-9644-d509c905673f","Type":"ContainerStarted","Data":"4d2288a85f149289c5ca254d9f77bf8cfa89fd5a7e6b6a026bce7676e3eaa3d9"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.944253 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.944568 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3a5867d7-6574-4f95-97c9-f6830600606a","Type":"ContainerDied","Data":"4e4b271046ada8760b61be95ca12599de001ebc652731a37b23a07403e6422f1"} Dec 01 15:21:34 crc kubenswrapper[4931]: I1201 15:21:34.944653 4931 scope.go:117] "RemoveContainer" containerID="50178a8e177ef0ae4ce84e40919d15aee6c2f140b9641179cff854b7979db0d5" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.237854 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.289618 4931 scope.go:117] "RemoveContainer" containerID="689bccfbbada21a31d285e8774317a671d2202169bb277c5a0debcf496e821f7" Dec 01 15:21:35 crc kubenswrapper[4931]: W1201 15:21:35.342096 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbae099e9_0e3a_440d_a3fb_c993aacb5014.slice/crio-f0e86c11a7c3352aed25f38e28920d2ee0f77663e56cd12e2a0fd5edde71a763 WatchSource:0}: Error finding container f0e86c11a7c3352aed25f38e28920d2ee0f77663e56cd12e2a0fd5edde71a763: Status 404 returned error can't find the container with id f0e86c11a7c3352aed25f38e28920d2ee0f77663e56cd12e2a0fd5edde71a763 Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.418428 4931 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.533820 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.541011 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-httpd-run\") pod \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.541090 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-scripts\") pod \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.541162 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.541222 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-logs\") pod \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.541259 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72bcf\" (UniqueName: \"kubernetes.io/projected/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-kube-api-access-72bcf\") pod \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " Dec 01 15:21:35 crc 
kubenswrapper[4931]: I1201 15:21:35.541311 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-combined-ca-bundle\") pod \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.541415 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-public-tls-certs\") pod \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.541471 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-config-data\") pod \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\" (UID: \"3d6c071d-fd2a-43e2-a7d2-c2499809aad0\") " Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.547699 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-logs" (OuterVolumeSpecName: "logs") pod "3d6c071d-fd2a-43e2-a7d2-c2499809aad0" (UID: "3d6c071d-fd2a-43e2-a7d2-c2499809aad0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.548320 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3d6c071d-fd2a-43e2-a7d2-c2499809aad0" (UID: "3d6c071d-fd2a-43e2-a7d2-c2499809aad0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.560890 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-scripts" (OuterVolumeSpecName: "scripts") pod "3d6c071d-fd2a-43e2-a7d2-c2499809aad0" (UID: "3d6c071d-fd2a-43e2-a7d2-c2499809aad0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.562202 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.604503 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:21:35 crc kubenswrapper[4931]: E1201 15:21:35.605578 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerName="glance-httpd" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.605593 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerName="glance-httpd" Dec 01 15:21:35 crc kubenswrapper[4931]: E1201 15:21:35.605622 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerName="glance-log" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.605628 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerName="glance-log" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.605797 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerName="glance-log" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.605811 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" containerName="glance-httpd" Dec 01 15:21:35 crc 
kubenswrapper[4931]: I1201 15:21:35.606730 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.611990 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.612178 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.614342 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.615507 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "3d6c071d-fd2a-43e2-a7d2-c2499809aad0" (UID: "3d6c071d-fd2a-43e2-a7d2-c2499809aad0"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.615559 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-kube-api-access-72bcf" (OuterVolumeSpecName: "kube-api-access-72bcf") pod "3d6c071d-fd2a-43e2-a7d2-c2499809aad0" (UID: "3d6c071d-fd2a-43e2-a7d2-c2499809aad0"). InnerVolumeSpecName "kube-api-access-72bcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.646656 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.646691 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.646720 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.646737 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.646750 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72bcf\" (UniqueName: \"kubernetes.io/projected/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-kube-api-access-72bcf\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.748731 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab67b9e9-4315-4390-b414-89b215ad823b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.749062 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.749215 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.749420 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.749518 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab67b9e9-4315-4390-b414-89b215ad823b-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.749607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfsrz\" (UniqueName: \"kubernetes.io/projected/ab67b9e9-4315-4390-b414-89b215ad823b-kube-api-access-wfsrz\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.749702 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.749776 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.812755 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.848135 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3d6c071d-fd2a-43e2-a7d2-c2499809aad0" (UID: "3d6c071d-fd2a-43e2-a7d2-c2499809aad0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.853326 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab67b9e9-4315-4390-b414-89b215ad823b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.853535 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.853657 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.853812 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.853842 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab67b9e9-4315-4390-b414-89b215ad823b-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: 
I1201 15:21:35.853912 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfsrz\" (UniqueName: \"kubernetes.io/projected/ab67b9e9-4315-4390-b414-89b215ad823b-kube-api-access-wfsrz\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.853972 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.854010 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.854070 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.854083 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.854513 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab67b9e9-4315-4390-b414-89b215ad823b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.855810 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.855827 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab67b9e9-4315-4390-b414-89b215ad823b-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.881276 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.882146 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.884505 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: 
I1201 15:21:35.890032 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab67b9e9-4315-4390-b414-89b215ad823b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.908624 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfsrz\" (UniqueName: \"kubernetes.io/projected/ab67b9e9-4315-4390-b414-89b215ad823b-kube-api-access-wfsrz\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.941513 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-config-data" (OuterVolumeSpecName: "config-data") pod "3d6c071d-fd2a-43e2-a7d2-c2499809aad0" (UID: "3d6c071d-fd2a-43e2-a7d2-c2499809aad0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.953328 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab67b9e9-4315-4390-b414-89b215ad823b\") " pod="openstack/glance-default-internal-api-0" Dec 01 15:21:35 crc kubenswrapper[4931]: I1201 15:21:35.956666 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.045811 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3d6c071d-fd2a-43e2-a7d2-c2499809aad0","Type":"ContainerDied","Data":"f5ec4d93245a42904cae955178a61b7393926f8549f4199a5730546bc58a8e52"} Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.045862 4931 scope.go:117] "RemoveContainer" containerID="b27349e697d30399f8075c2cb003d90367ae0464a5d4d5c19e50e2438314d4c5" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.045984 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.062619 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d6c071d-fd2a-43e2-a7d2-c2499809aad0" (UID: "3d6c071d-fd2a-43e2-a7d2-c2499809aad0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.072662 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.094359 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7465544595-sc668" event={"ID":"2b1d2c6e-39e6-438c-98e8-be76bfa71050","Type":"ContainerStarted","Data":"edd78cc6234b80aeeac4579b68ad97e342b115caca416e92e35a9a4a9b056ee4"} Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.098488 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerStarted","Data":"f0e86c11a7c3352aed25f38e28920d2ee0f77663e56cd12e2a0fd5edde71a763"} Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.123065 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4s7mh" event={"ID":"55f40fe4-164c-4aad-9644-d509c905673f","Type":"ContainerStarted","Data":"051bac8998ca5294a694a5c6e7d55760e12b3361336bb9ed30ffe90520719374"} Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.133823 4931 scope.go:117] "RemoveContainer" containerID="2ca6ff941a63c9f5cb9c155cddf7cbc90e315a52a2ce845b9fbff5144f8e2f48" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.137243 4931 generic.go:334] "Generic (PLEG): container finished" podID="8b99e4c2-d0b8-4206-a0cf-45f43d0557cf" containerID="719334c10940453f4f301adda937fa871eaa85b5cdb43cb6abbd70fd672d84a7" exitCode=0 Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.137309 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8bv8v" event={"ID":"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf","Type":"ContainerDied","Data":"719334c10940453f4f301adda937fa871eaa85b5cdb43cb6abbd70fd672d84a7"} Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.154876 4931 generic.go:334] "Generic (PLEG): container finished" podID="51cb7657-b8e4-43c0-b97a-633ae04b743d" containerID="78d7e0fc026beed159aacf1e748a609d7a96ba0569f8978e00d276915fac7a97" exitCode=0 
Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.154957 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c460-account-create-update-bp94q" event={"ID":"51cb7657-b8e4-43c0-b97a-633ae04b743d","Type":"ContainerDied","Data":"78d7e0fc026beed159aacf1e748a609d7a96ba0569f8978e00d276915fac7a97"} Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.162201 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6c071d-fd2a-43e2-a7d2-c2499809aad0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.164035 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6966-account-create-update-9b6ww" event={"ID":"680b6f89-d743-4467-b9da-68a831e24fa9","Type":"ContainerStarted","Data":"e4484967bafb645eb818b6fb9fd830cfe8e8c6470521ebe5a02ee0610bdae3da"} Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.178757 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f730-account-create-update-g9cxx" event={"ID":"f8d8e3fe-15ab-4d9d-939d-4198c8571597","Type":"ContainerStarted","Data":"c70d69ad7753d80d60411aa595bb9f7b0851229918227cbb1c8d9e8a83744c6e"} Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.183888 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-4s7mh" podStartSLOduration=6.183864286 podStartE2EDuration="6.183864286s" podCreationTimestamp="2025-12-01 15:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:36.160862671 +0000 UTC m=+1242.586736338" watchObservedRunningTime="2025-12-01 15:21:36.183864286 +0000 UTC m=+1242.609737953" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.210738 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pt2d4" 
event={"ID":"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4","Type":"ContainerStarted","Data":"e3cd6179438f9e93cc521c11ddd66e62b79fc89d780242d2b02cdbbed8005172"} Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.293301 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-pt2d4" podStartSLOduration=6.293279254 podStartE2EDuration="6.293279254s" podCreationTimestamp="2025-12-01 15:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:36.265276159 +0000 UTC m=+1242.691149826" watchObservedRunningTime="2025-12-01 15:21:36.293279254 +0000 UTC m=+1242.719152921" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.297257 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-f730-account-create-update-g9cxx" podStartSLOduration=6.297241435 podStartE2EDuration="6.297241435s" podCreationTimestamp="2025-12-01 15:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:36.282898653 +0000 UTC m=+1242.708772320" watchObservedRunningTime="2025-12-01 15:21:36.297241435 +0000 UTC m=+1242.723115102" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.310298 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-6966-account-create-update-9b6ww" podStartSLOduration=6.310279371 podStartE2EDuration="6.310279371s" podCreationTimestamp="2025-12-01 15:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:36.297604365 +0000 UTC m=+1242.723478032" watchObservedRunningTime="2025-12-01 15:21:36.310279371 +0000 UTC m=+1242.736153038" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.320635 4931 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="3a5867d7-6574-4f95-97c9-f6830600606a" path="/var/lib/kubelet/pods/3a5867d7-6574-4f95-97c9-f6830600606a/volumes" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.390200 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.413518 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.420351 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.422228 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.426854 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.427038 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.445439 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.582643 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.582699 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2316f8ea-3789-4702-91c2-a44da618bb8d-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.582738 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2316f8ea-3789-4702-91c2-a44da618bb8d-logs\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.582763 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.582786 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.582815 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k75dl\" (UniqueName: \"kubernetes.io/projected/2316f8ea-3789-4702-91c2-a44da618bb8d-kube-api-access-k75dl\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.582860 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.582877 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.602278 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6479b7c68-txrvx" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.602412 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.684101 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.684158 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2316f8ea-3789-4702-91c2-a44da618bb8d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.684210 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2316f8ea-3789-4702-91c2-a44da618bb8d-logs\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.684252 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.684284 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.684322 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k75dl\" (UniqueName: \"kubernetes.io/projected/2316f8ea-3789-4702-91c2-a44da618bb8d-kube-api-access-k75dl\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.684373 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.684416 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.684835 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.685349 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2316f8ea-3789-4702-91c2-a44da618bb8d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.688874 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2316f8ea-3789-4702-91c2-a44da618bb8d-logs\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.690646 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.693107 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.693174 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.694202 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2316f8ea-3789-4702-91c2-a44da618bb8d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.711314 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k75dl\" (UniqueName: \"kubernetes.io/projected/2316f8ea-3789-4702-91c2-a44da618bb8d-kube-api-access-k75dl\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.716671 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"2316f8ea-3789-4702-91c2-a44da618bb8d\") " pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.817664 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 15:21:36 crc kubenswrapper[4931]: I1201 15:21:36.972370 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 15:21:36 crc kubenswrapper[4931]: W1201 15:21:36.999006 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab67b9e9_4315_4390_b414_89b215ad823b.slice/crio-7b48b481686f0344d26a92df2e75f2255a405b7e8c3bee6dc10eb629d233c2d1 WatchSource:0}: Error finding container 7b48b481686f0344d26a92df2e75f2255a405b7e8c3bee6dc10eb629d233c2d1: Status 404 returned error can't find the container with id 7b48b481686f0344d26a92df2e75f2255a405b7e8c3bee6dc10eb629d233c2d1 Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.228285 4931 generic.go:334] "Generic (PLEG): container finished" podID="680b6f89-d743-4467-b9da-68a831e24fa9" containerID="e4484967bafb645eb818b6fb9fd830cfe8e8c6470521ebe5a02ee0610bdae3da" exitCode=0 Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.228378 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6966-account-create-update-9b6ww" event={"ID":"680b6f89-d743-4467-b9da-68a831e24fa9","Type":"ContainerDied","Data":"e4484967bafb645eb818b6fb9fd830cfe8e8c6470521ebe5a02ee0610bdae3da"} Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.232721 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7465544595-sc668" event={"ID":"2b1d2c6e-39e6-438c-98e8-be76bfa71050","Type":"ContainerStarted","Data":"7c823c51d557a4e0ac10df8cf269b877722b64c44a2cb9f9ac915425e96a139b"} Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.232972 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.233020 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.235355 4931 generic.go:334] "Generic (PLEG): container finished" podID="f8d8e3fe-15ab-4d9d-939d-4198c8571597" containerID="c70d69ad7753d80d60411aa595bb9f7b0851229918227cbb1c8d9e8a83744c6e" exitCode=0 Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.235450 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f730-account-create-update-g9cxx" event={"ID":"f8d8e3fe-15ab-4d9d-939d-4198c8571597","Type":"ContainerDied","Data":"c70d69ad7753d80d60411aa595bb9f7b0851229918227cbb1c8d9e8a83744c6e"} Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.242355 4931 generic.go:334] "Generic (PLEG): container finished" podID="37513865-ccd3-4ad7-89f0-66e1f3f6b9a4" containerID="e3cd6179438f9e93cc521c11ddd66e62b79fc89d780242d2b02cdbbed8005172" exitCode=0 Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.242484 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pt2d4" event={"ID":"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4","Type":"ContainerDied","Data":"e3cd6179438f9e93cc521c11ddd66e62b79fc89d780242d2b02cdbbed8005172"} Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.249133 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerStarted","Data":"f4ecd2a717bf16aff9335d90c55b4a62cebbfa7e4f9c525e8f12d3f165ed11ac"} Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.251710 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab67b9e9-4315-4390-b414-89b215ad823b","Type":"ContainerStarted","Data":"7b48b481686f0344d26a92df2e75f2255a405b7e8c3bee6dc10eb629d233c2d1"} Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.254479 4931 generic.go:334] "Generic (PLEG): container finished" podID="55f40fe4-164c-4aad-9644-d509c905673f" 
containerID="051bac8998ca5294a694a5c6e7d55760e12b3361336bb9ed30ffe90520719374" exitCode=0 Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.254677 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4s7mh" event={"ID":"55f40fe4-164c-4aad-9644-d509c905673f","Type":"ContainerDied","Data":"051bac8998ca5294a694a5c6e7d55760e12b3361336bb9ed30ffe90520719374"} Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.284695 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7465544595-sc668" podStartSLOduration=12.284668694 podStartE2EDuration="12.284668694s" podCreationTimestamp="2025-12-01 15:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:37.277642546 +0000 UTC m=+1243.703516224" watchObservedRunningTime="2025-12-01 15:21:37.284668694 +0000 UTC m=+1243.710542361" Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.454077 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.839199 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.934757 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j76xv\" (UniqueName: \"kubernetes.io/projected/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-kube-api-access-j76xv\") pod \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\" (UID: \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\") " Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.941756 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-operator-scripts\") pod \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\" (UID: \"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf\") " Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.967170 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b99e4c2-d0b8-4206-a0cf-45f43d0557cf" (UID: "8b99e4c2-d0b8-4206-a0cf-45f43d0557cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:37 crc kubenswrapper[4931]: I1201 15:21:37.984776 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-kube-api-access-j76xv" (OuterVolumeSpecName: "kube-api-access-j76xv") pod "8b99e4c2-d0b8-4206-a0cf-45f43d0557cf" (UID: "8b99e4c2-d0b8-4206-a0cf-45f43d0557cf"). InnerVolumeSpecName "kube-api-access-j76xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.045948 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.045980 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j76xv\" (UniqueName: \"kubernetes.io/projected/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf-kube-api-access-j76xv\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.050016 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.149579 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2jwz\" (UniqueName: \"kubernetes.io/projected/51cb7657-b8e4-43c0-b97a-633ae04b743d-kube-api-access-r2jwz\") pod \"51cb7657-b8e4-43c0-b97a-633ae04b743d\" (UID: \"51cb7657-b8e4-43c0-b97a-633ae04b743d\") " Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.149681 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51cb7657-b8e4-43c0-b97a-633ae04b743d-operator-scripts\") pod \"51cb7657-b8e4-43c0-b97a-633ae04b743d\" (UID: \"51cb7657-b8e4-43c0-b97a-633ae04b743d\") " Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.153137 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cb7657-b8e4-43c0-b97a-633ae04b743d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51cb7657-b8e4-43c0-b97a-633ae04b743d" (UID: "51cb7657-b8e4-43c0-b97a-633ae04b743d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.158221 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51cb7657-b8e4-43c0-b97a-633ae04b743d-kube-api-access-r2jwz" (OuterVolumeSpecName: "kube-api-access-r2jwz") pod "51cb7657-b8e4-43c0-b97a-633ae04b743d" (UID: "51cb7657-b8e4-43c0-b97a-633ae04b743d"). InnerVolumeSpecName "kube-api-access-r2jwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.254769 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2jwz\" (UniqueName: \"kubernetes.io/projected/51cb7657-b8e4-43c0-b97a-633ae04b743d-kube-api-access-r2jwz\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.254791 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51cb7657-b8e4-43c0-b97a-633ae04b743d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.257788 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6c071d-fd2a-43e2-a7d2-c2499809aad0" path="/var/lib/kubelet/pods/3d6c071d-fd2a-43e2-a7d2-c2499809aad0/volumes" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.282692 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2316f8ea-3789-4702-91c2-a44da618bb8d","Type":"ContainerStarted","Data":"648bb67dcdbdc9d0064acdea63b2cf7a2423ae4f2bc58c4898fd4911d04a27e1"} Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.291232 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerStarted","Data":"ccd507a4ab39343141acb0e429b65f9373bf9cd43e20b564604dab1c083f825b"} Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 
15:21:38.297909 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab67b9e9-4315-4390-b414-89b215ad823b","Type":"ContainerStarted","Data":"16218b77519e06685e0d2000bb6b0cc921d42e3ed5307742a0e7db31d1850fd7"} Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.300462 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8bv8v" event={"ID":"8b99e4c2-d0b8-4206-a0cf-45f43d0557cf","Type":"ContainerDied","Data":"64d997684ca60639714bc4656b81e26748bae91b601b394ceb549d7a9b1e36b8"} Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.300486 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64d997684ca60639714bc4656b81e26748bae91b601b394ceb549d7a9b1e36b8" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.300549 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8bv8v" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.304068 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c460-account-create-update-bp94q" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.305505 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c460-account-create-update-bp94q" event={"ID":"51cb7657-b8e4-43c0-b97a-633ae04b743d","Type":"ContainerDied","Data":"39f1a21e41df2b4a89c6ee6c54e17add3c638337f77e0a892f726a72c9651277"} Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.305564 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39f1a21e41df2b4a89c6ee6c54e17add3c638337f77e0a892f726a72c9651277" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.827923 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.982144 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680b6f89-d743-4467-b9da-68a831e24fa9-operator-scripts\") pod \"680b6f89-d743-4467-b9da-68a831e24fa9\" (UID: \"680b6f89-d743-4467-b9da-68a831e24fa9\") " Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.982602 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bghkq\" (UniqueName: \"kubernetes.io/projected/680b6f89-d743-4467-b9da-68a831e24fa9-kube-api-access-bghkq\") pod \"680b6f89-d743-4467-b9da-68a831e24fa9\" (UID: \"680b6f89-d743-4467-b9da-68a831e24fa9\") " Dec 01 15:21:38 crc kubenswrapper[4931]: I1201 15:21:38.983348 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/680b6f89-d743-4467-b9da-68a831e24fa9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "680b6f89-d743-4467-b9da-68a831e24fa9" (UID: "680b6f89-d743-4467-b9da-68a831e24fa9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.005245 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/680b6f89-d743-4467-b9da-68a831e24fa9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.012765 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680b6f89-d743-4467-b9da-68a831e24fa9-kube-api-access-bghkq" (OuterVolumeSpecName: "kube-api-access-bghkq") pod "680b6f89-d743-4467-b9da-68a831e24fa9" (UID: "680b6f89-d743-4467-b9da-68a831e24fa9"). InnerVolumeSpecName "kube-api-access-bghkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.060505 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.107549 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bghkq\" (UniqueName: \"kubernetes.io/projected/680b6f89-d743-4467-b9da-68a831e24fa9-kube-api-access-bghkq\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.195499 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.209088 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb9b2\" (UniqueName: \"kubernetes.io/projected/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-kube-api-access-cb9b2\") pod \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\" (UID: \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.209295 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-operator-scripts\") pod \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\" (UID: \"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.209304 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.210048 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37513865-ccd3-4ad7-89f0-66e1f3f6b9a4" (UID: "37513865-ccd3-4ad7-89f0-66e1f3f6b9a4"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.217688 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-kube-api-access-cb9b2" (OuterVolumeSpecName: "kube-api-access-cb9b2") pod "37513865-ccd3-4ad7-89f0-66e1f3f6b9a4" (UID: "37513865-ccd3-4ad7-89f0-66e1f3f6b9a4"). InnerVolumeSpecName "kube-api-access-cb9b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.310288 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjrfm\" (UniqueName: \"kubernetes.io/projected/f8d8e3fe-15ab-4d9d-939d-4198c8571597-kube-api-access-bjrfm\") pod \"f8d8e3fe-15ab-4d9d-939d-4198c8571597\" (UID: \"f8d8e3fe-15ab-4d9d-939d-4198c8571597\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.310728 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqjcb\" (UniqueName: \"kubernetes.io/projected/55f40fe4-164c-4aad-9644-d509c905673f-kube-api-access-mqjcb\") pod \"55f40fe4-164c-4aad-9644-d509c905673f\" (UID: \"55f40fe4-164c-4aad-9644-d509c905673f\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.310765 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f40fe4-164c-4aad-9644-d509c905673f-operator-scripts\") pod \"55f40fe4-164c-4aad-9644-d509c905673f\" (UID: \"55f40fe4-164c-4aad-9644-d509c905673f\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.310802 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d8e3fe-15ab-4d9d-939d-4198c8571597-operator-scripts\") pod \"f8d8e3fe-15ab-4d9d-939d-4198c8571597\" (UID: 
\"f8d8e3fe-15ab-4d9d-939d-4198c8571597\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.311240 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.311252 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb9b2\" (UniqueName: \"kubernetes.io/projected/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4-kube-api-access-cb9b2\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.311421 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55f40fe4-164c-4aad-9644-d509c905673f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55f40fe4-164c-4aad-9644-d509c905673f" (UID: "55f40fe4-164c-4aad-9644-d509c905673f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.311721 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d8e3fe-15ab-4d9d-939d-4198c8571597-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8d8e3fe-15ab-4d9d-939d-4198c8571597" (UID: "f8d8e3fe-15ab-4d9d-939d-4198c8571597"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.318567 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f40fe4-164c-4aad-9644-d509c905673f-kube-api-access-mqjcb" (OuterVolumeSpecName: "kube-api-access-mqjcb") pod "55f40fe4-164c-4aad-9644-d509c905673f" (UID: "55f40fe4-164c-4aad-9644-d509c905673f"). InnerVolumeSpecName "kube-api-access-mqjcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.318603 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d8e3fe-15ab-4d9d-939d-4198c8571597-kube-api-access-bjrfm" (OuterVolumeSpecName: "kube-api-access-bjrfm") pod "f8d8e3fe-15ab-4d9d-939d-4198c8571597" (UID: "f8d8e3fe-15ab-4d9d-939d-4198c8571597"). InnerVolumeSpecName "kube-api-access-bjrfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.323580 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6966-account-create-update-9b6ww" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.323987 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6966-account-create-update-9b6ww" event={"ID":"680b6f89-d743-4467-b9da-68a831e24fa9","Type":"ContainerDied","Data":"1c9643bd4d5cf7ea0ef6cff85698217372bc7d33fc809c6505829bd112b2839f"} Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.324033 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c9643bd4d5cf7ea0ef6cff85698217372bc7d33fc809c6505829bd112b2839f" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.332204 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f730-account-create-update-g9cxx" event={"ID":"f8d8e3fe-15ab-4d9d-939d-4198c8571597","Type":"ContainerDied","Data":"dbc12bf57bae114991b0c44a2ed66de75357e30f5e82b4e081ea2e4ceffd6de8"} Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.332240 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbc12bf57bae114991b0c44a2ed66de75357e30f5e82b4e081ea2e4ceffd6de8" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.332302 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f730-account-create-update-g9cxx" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.347261 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pt2d4" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.347354 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pt2d4" event={"ID":"37513865-ccd3-4ad7-89f0-66e1f3f6b9a4","Type":"ContainerDied","Data":"38987410686d3645ab4e64b3854967ac7d5aa20c3aa63443a01ce22bface722c"} Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.347411 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38987410686d3645ab4e64b3854967ac7d5aa20c3aa63443a01ce22bface722c" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.359585 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2316f8ea-3789-4702-91c2-a44da618bb8d","Type":"ContainerStarted","Data":"5e7ac3756f105d2bafd33ed90b0c9676951c197b2b37d8b305f3d56932b83719"} Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.381705 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4s7mh" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.384189 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4s7mh" event={"ID":"55f40fe4-164c-4aad-9644-d509c905673f","Type":"ContainerDied","Data":"4d2288a85f149289c5ca254d9f77bf8cfa89fd5a7e6b6a026bce7676e3eaa3d9"} Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.384241 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2288a85f149289c5ca254d9f77bf8cfa89fd5a7e6b6a026bce7676e3eaa3d9" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.387370 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cf859c4fb-g2pzn" event={"ID":"648f1c18-1467-4f3c-9ec3-8e1289c57a4f","Type":"ContainerDied","Data":"4d33ae8b0efc7fff609af493e5981b6538b653290a86dc610c8a12ad0a6c98c9"} Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.387454 4931 generic.go:334] "Generic (PLEG): container finished" podID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerID="4d33ae8b0efc7fff609af493e5981b6538b653290a86dc610c8a12ad0a6c98c9" exitCode=0 Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.399044 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.399025791 podStartE2EDuration="4.399025791s" podCreationTimestamp="2025-12-01 15:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:39.393157187 +0000 UTC m=+1245.819030854" watchObservedRunningTime="2025-12-01 15:21:39.399025791 +0000 UTC m=+1245.824899458" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.413530 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqjcb\" (UniqueName: \"kubernetes.io/projected/55f40fe4-164c-4aad-9644-d509c905673f-kube-api-access-mqjcb\") on node \"crc\" 
DevicePath \"\"" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.413559 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55f40fe4-164c-4aad-9644-d509c905673f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.413591 4931 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8d8e3fe-15ab-4d9d-939d-4198c8571597-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.413604 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjrfm\" (UniqueName: \"kubernetes.io/projected/f8d8e3fe-15ab-4d9d-939d-4198c8571597-kube-api-access-bjrfm\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.784332 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.923907 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-config\") pod \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.923971 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-combined-ca-bundle\") pod \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.923994 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-httpd-config\") pod 
\"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.924019 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-ovndb-tls-certs\") pod \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.924125 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k9cl\" (UniqueName: \"kubernetes.io/projected/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-kube-api-access-5k9cl\") pod \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\" (UID: \"648f1c18-1467-4f3c-9ec3-8e1289c57a4f\") " Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.944536 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "648f1c18-1467-4f3c-9ec3-8e1289c57a4f" (UID: "648f1c18-1467-4f3c-9ec3-8e1289c57a4f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.945550 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-kube-api-access-5k9cl" (OuterVolumeSpecName: "kube-api-access-5k9cl") pod "648f1c18-1467-4f3c-9ec3-8e1289c57a4f" (UID: "648f1c18-1467-4f3c-9ec3-8e1289c57a4f"). InnerVolumeSpecName "kube-api-access-5k9cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.984512 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "648f1c18-1467-4f3c-9ec3-8e1289c57a4f" (UID: "648f1c18-1467-4f3c-9ec3-8e1289c57a4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:39 crc kubenswrapper[4931]: I1201 15:21:39.987550 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-config" (OuterVolumeSpecName: "config") pod "648f1c18-1467-4f3c-9ec3-8e1289c57a4f" (UID: "648f1c18-1467-4f3c-9ec3-8e1289c57a4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.005299 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "648f1c18-1467-4f3c-9ec3-8e1289c57a4f" (UID: "648f1c18-1467-4f3c-9ec3-8e1289c57a4f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.026481 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.026529 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.026544 4931 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.026556 4931 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.026569 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k9cl\" (UniqueName: \"kubernetes.io/projected/648f1c18-1467-4f3c-9ec3-8e1289c57a4f-kube-api-access-5k9cl\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.413268 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab67b9e9-4315-4390-b414-89b215ad823b","Type":"ContainerStarted","Data":"cb0248d88b4bb630aaa4c2159d3b413f54a787a6d2620a10d42d7b2d4b87e40b"} Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.416785 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6cf859c4fb-g2pzn" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.416897 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cf859c4fb-g2pzn" event={"ID":"648f1c18-1467-4f3c-9ec3-8e1289c57a4f","Type":"ContainerDied","Data":"23ff55ac72a9293b09f23682c396bcefce879e31dd82ba8f074f56ce45a62f7e"} Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.416957 4931 scope.go:117] "RemoveContainer" containerID="c624b834b4c99846acbfd0f012a6bcd0efadaca9bd00aea534cdf91fea2fd53a" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.424696 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2316f8ea-3789-4702-91c2-a44da618bb8d","Type":"ContainerStarted","Data":"8cfbb8d378b1ebee45592dde24e9fab568020872468f6fb51af51a067334ca8b"} Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.429974 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerStarted","Data":"feee504edc1663103ca486e5355b80982c675d47a4ff6f02f5f67f22503e14f4"} Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.447937 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6cf859c4fb-g2pzn"] Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.456291 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6cf859c4fb-g2pzn"] Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.458190 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.45816173 podStartE2EDuration="4.45816173s" podCreationTimestamp="2025-12-01 15:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:21:40.456866754 +0000 UTC m=+1246.882740441" 
watchObservedRunningTime="2025-12-01 15:21:40.45816173 +0000 UTC m=+1246.884035397" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.461869 4931 scope.go:117] "RemoveContainer" containerID="4d33ae8b0efc7fff609af493e5981b6538b653290a86dc610c8a12ad0a6c98c9" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.996527 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lbnzw"] Dec 01 15:21:40 crc kubenswrapper[4931]: E1201 15:21:40.997826 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55f40fe4-164c-4aad-9644-d509c905673f" containerName="mariadb-database-create" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.998101 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f40fe4-164c-4aad-9644-d509c905673f" containerName="mariadb-database-create" Dec 01 15:21:40 crc kubenswrapper[4931]: E1201 15:21:40.998343 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d8e3fe-15ab-4d9d-939d-4198c8571597" containerName="mariadb-account-create-update" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.998410 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d8e3fe-15ab-4d9d-939d-4198c8571597" containerName="mariadb-account-create-update" Dec 01 15:21:40 crc kubenswrapper[4931]: E1201 15:21:40.998476 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51cb7657-b8e4-43c0-b97a-633ae04b743d" containerName="mariadb-account-create-update" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.998527 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="51cb7657-b8e4-43c0-b97a-633ae04b743d" containerName="mariadb-account-create-update" Dec 01 15:21:40 crc kubenswrapper[4931]: E1201 15:21:40.998579 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerName="neutron-httpd" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.999503 4931 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerName="neutron-httpd" Dec 01 15:21:40 crc kubenswrapper[4931]: E1201 15:21:40.999564 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680b6f89-d743-4467-b9da-68a831e24fa9" containerName="mariadb-account-create-update" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.999613 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="680b6f89-d743-4467-b9da-68a831e24fa9" containerName="mariadb-account-create-update" Dec 01 15:21:40 crc kubenswrapper[4931]: E1201 15:21:40.999838 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerName="neutron-api" Dec 01 15:21:40 crc kubenswrapper[4931]: I1201 15:21:40.999897 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerName="neutron-api" Dec 01 15:21:40 crc kubenswrapper[4931]: E1201 15:21:40.999958 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b99e4c2-d0b8-4206-a0cf-45f43d0557cf" containerName="mariadb-database-create" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000006 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b99e4c2-d0b8-4206-a0cf-45f43d0557cf" containerName="mariadb-database-create" Dec 01 15:21:41 crc kubenswrapper[4931]: E1201 15:21:41.000076 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37513865-ccd3-4ad7-89f0-66e1f3f6b9a4" containerName="mariadb-database-create" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000126 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="37513865-ccd3-4ad7-89f0-66e1f3f6b9a4" containerName="mariadb-database-create" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000415 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerName="neutron-httpd" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000481 4931 
memory_manager.go:354] "RemoveStaleState removing state" podUID="680b6f89-d743-4467-b9da-68a831e24fa9" containerName="mariadb-account-create-update" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000537 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b99e4c2-d0b8-4206-a0cf-45f43d0557cf" containerName="mariadb-database-create" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000586 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="37513865-ccd3-4ad7-89f0-66e1f3f6b9a4" containerName="mariadb-database-create" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000657 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d8e3fe-15ab-4d9d-939d-4198c8571597" containerName="mariadb-account-create-update" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000714 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="55f40fe4-164c-4aad-9644-d509c905673f" containerName="mariadb-database-create" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000764 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" containerName="neutron-api" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.000815 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="51cb7657-b8e4-43c0-b97a-633ae04b743d" containerName="mariadb-account-create-update" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.001438 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.004190 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wxvrz" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.004573 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.004752 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.023219 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lbnzw"] Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.145991 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-config-data\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.146046 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4bk\" (UniqueName: \"kubernetes.io/projected/2e39a0af-5976-435f-b5c5-6e1e820ea761-kube-api-access-xv4bk\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.146131 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-scripts\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " 
pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.146166 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.248532 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-config-data\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.248581 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4bk\" (UniqueName: \"kubernetes.io/projected/2e39a0af-5976-435f-b5c5-6e1e820ea761-kube-api-access-xv4bk\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.249009 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-scripts\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.249535 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: 
\"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.254051 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-scripts\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.254164 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-config-data\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.254583 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.274043 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4bk\" (UniqueName: \"kubernetes.io/projected/2e39a0af-5976-435f-b5c5-6e1e820ea761-kube-api-access-xv4bk\") pod \"nova-cell0-conductor-db-sync-lbnzw\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.329043 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.442968 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.446577 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7465544595-sc668" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.456300 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerStarted","Data":"ca606e4bf030f2828d28220ac6de597b895f186387360a57ca2484047aad6d40"} Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.456564 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.527054 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.105147029 podStartE2EDuration="7.527033883s" podCreationTimestamp="2025-12-01 15:21:34 +0000 UTC" firstStartedPulling="2025-12-01 15:21:35.345184849 +0000 UTC m=+1241.771058516" lastFinishedPulling="2025-12-01 15:21:40.767071693 +0000 UTC m=+1247.192945370" observedRunningTime="2025-12-01 15:21:41.49554486 +0000 UTC m=+1247.921418537" watchObservedRunningTime="2025-12-01 15:21:41.527033883 +0000 UTC m=+1247.952907540" Dec 01 15:21:41 crc kubenswrapper[4931]: I1201 15:21:41.863513 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lbnzw"] Dec 01 15:21:41 crc kubenswrapper[4931]: W1201 15:21:41.872076 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e39a0af_5976_435f_b5c5_6e1e820ea761.slice/crio-087ae801b104491010bd2e430abc3b8dc237bddc21e4aaca9ffc0fadf44ecf77 
WatchSource:0}: Error finding container 087ae801b104491010bd2e430abc3b8dc237bddc21e4aaca9ffc0fadf44ecf77: Status 404 returned error can't find the container with id 087ae801b104491010bd2e430abc3b8dc237bddc21e4aaca9ffc0fadf44ecf77 Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.278829 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="648f1c18-1467-4f3c-9ec3-8e1289c57a4f" path="/var/lib/kubelet/pods/648f1c18-1467-4f3c-9ec3-8e1289c57a4f/volumes" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.361585 4931 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode63c8e3a-39a0-4a76-9754-16250b21f1dc"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode63c8e3a-39a0-4a76-9754-16250b21f1dc] : Timed out while waiting for systemd to remove kubepods-besteffort-pode63c8e3a_39a0_4a76_9754_16250b21f1dc.slice" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.478063 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lbnzw" event={"ID":"2e39a0af-5976-435f-b5c5-6e1e820ea761","Type":"ContainerStarted","Data":"087ae801b104491010bd2e430abc3b8dc237bddc21e4aaca9ffc0fadf44ecf77"} Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.480067 4931 generic.go:334] "Generic (PLEG): container finished" podID="97ed61f3-8ca0-4aee-afae-168398babe70" containerID="91144dfad5dcee4bce9e3dffbc29f3988a0e250ee32ee1cfa8b7b6b0c59d19c5" exitCode=137 Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.480159 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6479b7c68-txrvx" event={"ID":"97ed61f3-8ca0-4aee-afae-168398babe70","Type":"ContainerDied","Data":"91144dfad5dcee4bce9e3dffbc29f3988a0e250ee32ee1cfa8b7b6b0c59d19c5"} Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.765120 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.880505 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-config-data\") pod \"97ed61f3-8ca0-4aee-afae-168398babe70\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.880580 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-combined-ca-bundle\") pod \"97ed61f3-8ca0-4aee-afae-168398babe70\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.880661 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-tls-certs\") pod \"97ed61f3-8ca0-4aee-afae-168398babe70\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.880697 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-secret-key\") pod \"97ed61f3-8ca0-4aee-afae-168398babe70\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.880730 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ed61f3-8ca0-4aee-afae-168398babe70-logs\") pod \"97ed61f3-8ca0-4aee-afae-168398babe70\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.880893 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2l7x\" 
(UniqueName: \"kubernetes.io/projected/97ed61f3-8ca0-4aee-afae-168398babe70-kube-api-access-m2l7x\") pod \"97ed61f3-8ca0-4aee-afae-168398babe70\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.880956 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-scripts\") pod \"97ed61f3-8ca0-4aee-afae-168398babe70\" (UID: \"97ed61f3-8ca0-4aee-afae-168398babe70\") " Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.883108 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ed61f3-8ca0-4aee-afae-168398babe70-logs" (OuterVolumeSpecName: "logs") pod "97ed61f3-8ca0-4aee-afae-168398babe70" (UID: "97ed61f3-8ca0-4aee-afae-168398babe70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.890586 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ed61f3-8ca0-4aee-afae-168398babe70-kube-api-access-m2l7x" (OuterVolumeSpecName: "kube-api-access-m2l7x") pod "97ed61f3-8ca0-4aee-afae-168398babe70" (UID: "97ed61f3-8ca0-4aee-afae-168398babe70"). InnerVolumeSpecName "kube-api-access-m2l7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.914466 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "97ed61f3-8ca0-4aee-afae-168398babe70" (UID: "97ed61f3-8ca0-4aee-afae-168398babe70"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.915580 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-scripts" (OuterVolumeSpecName: "scripts") pod "97ed61f3-8ca0-4aee-afae-168398babe70" (UID: "97ed61f3-8ca0-4aee-afae-168398babe70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.928941 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97ed61f3-8ca0-4aee-afae-168398babe70" (UID: "97ed61f3-8ca0-4aee-afae-168398babe70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.946275 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-config-data" (OuterVolumeSpecName: "config-data") pod "97ed61f3-8ca0-4aee-afae-168398babe70" (UID: "97ed61f3-8ca0-4aee-afae-168398babe70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.946473 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "97ed61f3-8ca0-4aee-afae-168398babe70" (UID: "97ed61f3-8ca0-4aee-afae-168398babe70"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.983184 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.983218 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.983231 4931 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.983241 4931 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/97ed61f3-8ca0-4aee-afae-168398babe70-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.983266 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ed61f3-8ca0-4aee-afae-168398babe70-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.983277 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2l7x\" (UniqueName: \"kubernetes.io/projected/97ed61f3-8ca0-4aee-afae-168398babe70-kube-api-access-m2l7x\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:42 crc kubenswrapper[4931]: I1201 15:21:42.983288 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97ed61f3-8ca0-4aee-afae-168398babe70-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:43 crc kubenswrapper[4931]: I1201 15:21:43.525412 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6479b7c68-txrvx" event={"ID":"97ed61f3-8ca0-4aee-afae-168398babe70","Type":"ContainerDied","Data":"5131abc1b85f2d20da4a7776bc6d7fd12f68918cf3a16618e728bc5936ca81fe"} Dec 01 15:21:43 crc kubenswrapper[4931]: I1201 15:21:43.525468 4931 scope.go:117] "RemoveContainer" containerID="63a67a4f00ff8d02071cd5c46b0692c9f484b2d61acc41dcf8b2b53a1f51fdb8" Dec 01 15:21:43 crc kubenswrapper[4931]: I1201 15:21:43.526121 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6479b7c68-txrvx" Dec 01 15:21:43 crc kubenswrapper[4931]: I1201 15:21:43.566054 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6479b7c68-txrvx"] Dec 01 15:21:43 crc kubenswrapper[4931]: I1201 15:21:43.573520 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6479b7c68-txrvx"] Dec 01 15:21:43 crc kubenswrapper[4931]: I1201 15:21:43.725111 4931 scope.go:117] "RemoveContainer" containerID="91144dfad5dcee4bce9e3dffbc29f3988a0e250ee32ee1cfa8b7b6b0c59d19c5" Dec 01 15:21:44 crc kubenswrapper[4931]: I1201 15:21:44.256210 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" path="/var/lib/kubelet/pods/97ed61f3-8ca0-4aee-afae-168398babe70/volumes" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.074356 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.074441 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.114412 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.126005 4931 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.562977 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.563295 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.818321 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.818417 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.850521 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 15:21:46 crc kubenswrapper[4931]: I1201 15:21:46.865403 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 15:21:47 crc kubenswrapper[4931]: I1201 15:21:47.572864 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 15:21:47 crc kubenswrapper[4931]: I1201 15:21:47.573198 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 15:21:47 crc kubenswrapper[4931]: I1201 15:21:47.862725 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:47 crc kubenswrapper[4931]: I1201 15:21:47.863066 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="ceilometer-central-agent" 
containerID="cri-o://f4ecd2a717bf16aff9335d90c55b4a62cebbfa7e4f9c525e8f12d3f165ed11ac" gracePeriod=30 Dec 01 15:21:47 crc kubenswrapper[4931]: I1201 15:21:47.863843 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="proxy-httpd" containerID="cri-o://ca606e4bf030f2828d28220ac6de597b895f186387360a57ca2484047aad6d40" gracePeriod=30 Dec 01 15:21:47 crc kubenswrapper[4931]: I1201 15:21:47.863915 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="sg-core" containerID="cri-o://feee504edc1663103ca486e5355b80982c675d47a4ff6f02f5f67f22503e14f4" gracePeriod=30 Dec 01 15:21:47 crc kubenswrapper[4931]: I1201 15:21:47.864007 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="ceilometer-notification-agent" containerID="cri-o://ccd507a4ab39343141acb0e429b65f9373bf9cd43e20b564604dab1c083f825b" gracePeriod=30 Dec 01 15:21:48 crc kubenswrapper[4931]: I1201 15:21:48.581605 4931 generic.go:334] "Generic (PLEG): container finished" podID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerID="ca606e4bf030f2828d28220ac6de597b895f186387360a57ca2484047aad6d40" exitCode=0 Dec 01 15:21:48 crc kubenswrapper[4931]: I1201 15:21:48.581640 4931 generic.go:334] "Generic (PLEG): container finished" podID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerID="feee504edc1663103ca486e5355b80982c675d47a4ff6f02f5f67f22503e14f4" exitCode=2 Dec 01 15:21:48 crc kubenswrapper[4931]: I1201 15:21:48.581647 4931 generic.go:334] "Generic (PLEG): container finished" podID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerID="f4ecd2a717bf16aff9335d90c55b4a62cebbfa7e4f9c525e8f12d3f165ed11ac" exitCode=0 Dec 01 15:21:48 crc kubenswrapper[4931]: I1201 15:21:48.581728 4931 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerDied","Data":"ca606e4bf030f2828d28220ac6de597b895f186387360a57ca2484047aad6d40"} Dec 01 15:21:48 crc kubenswrapper[4931]: I1201 15:21:48.581791 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerDied","Data":"feee504edc1663103ca486e5355b80982c675d47a4ff6f02f5f67f22503e14f4"} Dec 01 15:21:48 crc kubenswrapper[4931]: I1201 15:21:48.581802 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerDied","Data":"f4ecd2a717bf16aff9335d90c55b4a62cebbfa7e4f9c525e8f12d3f165ed11ac"} Dec 01 15:21:48 crc kubenswrapper[4931]: I1201 15:21:48.766027 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:48 crc kubenswrapper[4931]: I1201 15:21:48.766133 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 15:21:48 crc kubenswrapper[4931]: I1201 15:21:48.775881 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.590407 4931 generic.go:334] "Generic (PLEG): container finished" podID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerID="ccd507a4ab39343141acb0e429b65f9373bf9cd43e20b564604dab1c083f825b" exitCode=0 Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.590696 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.590704 4931 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.591273 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerDied","Data":"ccd507a4ab39343141acb0e429b65f9373bf9cd43e20b564604dab1c083f825b"} Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.666947 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.872404 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.872468 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.872514 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.873269 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57835c837fadcd2c88b5f726b0f5a7aef7db7caf224620b840275c3b23741956"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.873323 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" 
containerName="machine-config-daemon" containerID="cri-o://57835c837fadcd2c88b5f726b0f5a7aef7db7caf224620b840275c3b23741956" gracePeriod=600 Dec 01 15:21:49 crc kubenswrapper[4931]: I1201 15:21:49.950850 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 15:21:50 crc kubenswrapper[4931]: I1201 15:21:50.603109 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="57835c837fadcd2c88b5f726b0f5a7aef7db7caf224620b840275c3b23741956" exitCode=0 Dec 01 15:21:50 crc kubenswrapper[4931]: I1201 15:21:50.603171 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"57835c837fadcd2c88b5f726b0f5a7aef7db7caf224620b840275c3b23741956"} Dec 01 15:21:50 crc kubenswrapper[4931]: I1201 15:21:50.603560 4931 scope.go:117] "RemoveContainer" containerID="4f551f2eb27cc6e8158d6be30d6ee18e92fc02ccf79b3c9e4d9f5dcf4740103b" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.448127 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.565134 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-log-httpd\") pod \"bae099e9-0e3a-440d-a3fb-c993aacb5014\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.565225 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-combined-ca-bundle\") pod \"bae099e9-0e3a-440d-a3fb-c993aacb5014\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.565314 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj75k\" (UniqueName: \"kubernetes.io/projected/bae099e9-0e3a-440d-a3fb-c993aacb5014-kube-api-access-bj75k\") pod \"bae099e9-0e3a-440d-a3fb-c993aacb5014\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.565367 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-sg-core-conf-yaml\") pod \"bae099e9-0e3a-440d-a3fb-c993aacb5014\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.565496 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-scripts\") pod \"bae099e9-0e3a-440d-a3fb-c993aacb5014\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.565531 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-config-data\") pod \"bae099e9-0e3a-440d-a3fb-c993aacb5014\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.565557 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-run-httpd\") pod \"bae099e9-0e3a-440d-a3fb-c993aacb5014\" (UID: \"bae099e9-0e3a-440d-a3fb-c993aacb5014\") " Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.566306 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bae099e9-0e3a-440d-a3fb-c993aacb5014" (UID: "bae099e9-0e3a-440d-a3fb-c993aacb5014"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.566608 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bae099e9-0e3a-440d-a3fb-c993aacb5014" (UID: "bae099e9-0e3a-440d-a3fb-c993aacb5014"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.573215 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae099e9-0e3a-440d-a3fb-c993aacb5014-kube-api-access-bj75k" (OuterVolumeSpecName: "kube-api-access-bj75k") pod "bae099e9-0e3a-440d-a3fb-c993aacb5014" (UID: "bae099e9-0e3a-440d-a3fb-c993aacb5014"). InnerVolumeSpecName "kube-api-access-bj75k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.573815 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-scripts" (OuterVolumeSpecName: "scripts") pod "bae099e9-0e3a-440d-a3fb-c993aacb5014" (UID: "bae099e9-0e3a-440d-a3fb-c993aacb5014"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.596758 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bae099e9-0e3a-440d-a3fb-c993aacb5014" (UID: "bae099e9-0e3a-440d-a3fb-c993aacb5014"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.660943 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bae099e9-0e3a-440d-a3fb-c993aacb5014","Type":"ContainerDied","Data":"f0e86c11a7c3352aed25f38e28920d2ee0f77663e56cd12e2a0fd5edde71a763"} Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.661008 4931 scope.go:117] "RemoveContainer" containerID="ca606e4bf030f2828d28220ac6de597b895f186387360a57ca2484047aad6d40" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.661152 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.667204 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.667240 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.667251 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bae099e9-0e3a-440d-a3fb-c993aacb5014-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.667259 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj75k\" (UniqueName: \"kubernetes.io/projected/bae099e9-0e3a-440d-a3fb-c993aacb5014-kube-api-access-bj75k\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.667269 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.670922 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lbnzw" event={"ID":"2e39a0af-5976-435f-b5c5-6e1e820ea761","Type":"ContainerStarted","Data":"6f6c2ab96e502e1e1082e5c5fc79b7f1a5c8684fdd852436e087527d8efec353"} Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.695621 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"58f06495cba8dbbb838e05fa0d374d0f9cc22d5fa8c965e16e9109a8373c2319"} Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.697675 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bae099e9-0e3a-440d-a3fb-c993aacb5014" (UID: "bae099e9-0e3a-440d-a3fb-c993aacb5014"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.703583 4931 scope.go:117] "RemoveContainer" containerID="feee504edc1663103ca486e5355b80982c675d47a4ff6f02f5f67f22503e14f4" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.720091 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lbnzw" podStartSLOduration=2.452627507 podStartE2EDuration="12.720074614s" podCreationTimestamp="2025-12-01 15:21:40 +0000 UTC" firstStartedPulling="2025-12-01 15:21:41.874181397 +0000 UTC m=+1248.300055064" lastFinishedPulling="2025-12-01 15:21:52.141628504 +0000 UTC m=+1258.567502171" observedRunningTime="2025-12-01 15:21:52.694428605 +0000 UTC m=+1259.120302292" watchObservedRunningTime="2025-12-01 15:21:52.720074614 +0000 UTC m=+1259.145948281" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.756642 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-config-data" (OuterVolumeSpecName: "config-data") pod "bae099e9-0e3a-440d-a3fb-c993aacb5014" (UID: "bae099e9-0e3a-440d-a3fb-c993aacb5014"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.770576 4931 scope.go:117] "RemoveContainer" containerID="ccd507a4ab39343141acb0e429b65f9373bf9cd43e20b564604dab1c083f825b" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.772585 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.772616 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae099e9-0e3a-440d-a3fb-c993aacb5014-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.803691 4931 scope.go:117] "RemoveContainer" containerID="f4ecd2a717bf16aff9335d90c55b4a62cebbfa7e4f9c525e8f12d3f165ed11ac" Dec 01 15:21:52 crc kubenswrapper[4931]: I1201 15:21:52.997863 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.011841 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.025293 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:53 crc kubenswrapper[4931]: E1201 15:21:53.025769 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="sg-core" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.025788 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="sg-core" Dec 01 15:21:53 crc kubenswrapper[4931]: E1201 15:21:53.025807 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="ceilometer-central-agent" Dec 01 15:21:53 crc 
kubenswrapper[4931]: I1201 15:21:53.025814 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="ceilometer-central-agent" Dec 01 15:21:53 crc kubenswrapper[4931]: E1201 15:21:53.025830 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.025836 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon" Dec 01 15:21:53 crc kubenswrapper[4931]: E1201 15:21:53.025856 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon-log" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.025862 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon-log" Dec 01 15:21:53 crc kubenswrapper[4931]: E1201 15:21:53.025879 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="proxy-httpd" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.025886 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="proxy-httpd" Dec 01 15:21:53 crc kubenswrapper[4931]: E1201 15:21:53.025898 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="ceilometer-notification-agent" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.025904 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="ceilometer-notification-agent" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.026073 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon-log" Dec 01 15:21:53 crc 
kubenswrapper[4931]: I1201 15:21:53.026088 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="ceilometer-notification-agent" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.026101 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ed61f3-8ca0-4aee-afae-168398babe70" containerName="horizon" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.026114 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="sg-core" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.026124 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="proxy-httpd" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.026139 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" containerName="ceilometer-central-agent" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.027773 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.029730 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.030083 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.032755 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.178727 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-config-data\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.178797 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.179075 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-scripts\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.179118 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-log-httpd\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " 
pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.179215 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.179277 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbsrx\" (UniqueName: \"kubernetes.io/projected/306db296-7d68-4bb8-b723-782c6671f98c-kube-api-access-fbsrx\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.179302 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-run-httpd\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.280773 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-scripts\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.280822 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-log-httpd\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.280867 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.280896 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbsrx\" (UniqueName: \"kubernetes.io/projected/306db296-7d68-4bb8-b723-782c6671f98c-kube-api-access-fbsrx\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.280913 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-run-httpd\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.280956 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-config-data\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.280991 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.281351 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-log-httpd\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 
crc kubenswrapper[4931]: I1201 15:21:53.281914 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-run-httpd\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.286343 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-config-data\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.286832 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.287664 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-scripts\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.288651 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.298955 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbsrx\" (UniqueName: \"kubernetes.io/projected/306db296-7d68-4bb8-b723-782c6671f98c-kube-api-access-fbsrx\") pod \"ceilometer-0\" (UID: 
\"306db296-7d68-4bb8-b723-782c6671f98c\") " pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.400162 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:21:53 crc kubenswrapper[4931]: I1201 15:21:53.903930 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:53 crc kubenswrapper[4931]: W1201 15:21:53.904554 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod306db296_7d68_4bb8_b723_782c6671f98c.slice/crio-76f375561032898dea900ed3e1bc9c92b23a43e58aca0eed6996eb1a1620dc2c WatchSource:0}: Error finding container 76f375561032898dea900ed3e1bc9c92b23a43e58aca0eed6996eb1a1620dc2c: Status 404 returned error can't find the container with id 76f375561032898dea900ed3e1bc9c92b23a43e58aca0eed6996eb1a1620dc2c Dec 01 15:21:54 crc kubenswrapper[4931]: I1201 15:21:54.260377 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae099e9-0e3a-440d-a3fb-c993aacb5014" path="/var/lib/kubelet/pods/bae099e9-0e3a-440d-a3fb-c993aacb5014/volumes" Dec 01 15:21:54 crc kubenswrapper[4931]: I1201 15:21:54.721560 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerStarted","Data":"76f375561032898dea900ed3e1bc9c92b23a43e58aca0eed6996eb1a1620dc2c"} Dec 01 15:21:55 crc kubenswrapper[4931]: I1201 15:21:55.732609 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerStarted","Data":"1126984e6d5e4a64f5f10155b5a365448db25ae4b1f2c3b2031560fea82db4f2"} Dec 01 15:21:55 crc kubenswrapper[4931]: I1201 15:21:55.732924 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerStarted","Data":"c3357a942af2ac1cab5eca60ca012c3756f0a73d2b38703faf0fd38f68e1c6e1"} Dec 01 15:21:59 crc kubenswrapper[4931]: I1201 15:21:59.713919 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:21:59 crc kubenswrapper[4931]: I1201 15:21:59.785362 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerStarted","Data":"1dfcf138d6cfa4355c07639aa954ef6b6506e9fcbfa6587f3e09dd6d8f8b7ab1"} Dec 01 15:22:02 crc kubenswrapper[4931]: I1201 15:22:02.820004 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerStarted","Data":"74fc67e0951be39c0aabc699b6b5949b41f004e5496777839d2abe2cca7752c3"} Dec 01 15:22:02 crc kubenswrapper[4931]: I1201 15:22:02.820537 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 15:22:02 crc kubenswrapper[4931]: I1201 15:22:02.820137 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="ceilometer-central-agent" containerID="cri-o://c3357a942af2ac1cab5eca60ca012c3756f0a73d2b38703faf0fd38f68e1c6e1" gracePeriod=30 Dec 01 15:22:02 crc kubenswrapper[4931]: I1201 15:22:02.820607 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="sg-core" containerID="cri-o://1dfcf138d6cfa4355c07639aa954ef6b6506e9fcbfa6587f3e09dd6d8f8b7ab1" gracePeriod=30 Dec 01 15:22:02 crc kubenswrapper[4931]: I1201 15:22:02.820612 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="proxy-httpd" 
containerID="cri-o://74fc67e0951be39c0aabc699b6b5949b41f004e5496777839d2abe2cca7752c3" gracePeriod=30 Dec 01 15:22:02 crc kubenswrapper[4931]: I1201 15:22:02.820670 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="ceilometer-notification-agent" containerID="cri-o://1126984e6d5e4a64f5f10155b5a365448db25ae4b1f2c3b2031560fea82db4f2" gracePeriod=30 Dec 01 15:22:02 crc kubenswrapper[4931]: I1201 15:22:02.848858 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.20634147 podStartE2EDuration="10.848831132s" podCreationTimestamp="2025-12-01 15:21:52 +0000 UTC" firstStartedPulling="2025-12-01 15:21:53.907244543 +0000 UTC m=+1260.333118210" lastFinishedPulling="2025-12-01 15:22:02.549734205 +0000 UTC m=+1268.975607872" observedRunningTime="2025-12-01 15:22:02.83737243 +0000 UTC m=+1269.263246107" watchObservedRunningTime="2025-12-01 15:22:02.848831132 +0000 UTC m=+1269.274704799" Dec 01 15:22:03 crc kubenswrapper[4931]: I1201 15:22:03.834715 4931 generic.go:334] "Generic (PLEG): container finished" podID="306db296-7d68-4bb8-b723-782c6671f98c" containerID="1dfcf138d6cfa4355c07639aa954ef6b6506e9fcbfa6587f3e09dd6d8f8b7ab1" exitCode=2 Dec 01 15:22:03 crc kubenswrapper[4931]: I1201 15:22:03.835026 4931 generic.go:334] "Generic (PLEG): container finished" podID="306db296-7d68-4bb8-b723-782c6671f98c" containerID="1126984e6d5e4a64f5f10155b5a365448db25ae4b1f2c3b2031560fea82db4f2" exitCode=0 Dec 01 15:22:03 crc kubenswrapper[4931]: I1201 15:22:03.834830 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerDied","Data":"1dfcf138d6cfa4355c07639aa954ef6b6506e9fcbfa6587f3e09dd6d8f8b7ab1"} Dec 01 15:22:03 crc kubenswrapper[4931]: I1201 15:22:03.835070 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerDied","Data":"1126984e6d5e4a64f5f10155b5a365448db25ae4b1f2c3b2031560fea82db4f2"} Dec 01 15:22:08 crc kubenswrapper[4931]: I1201 15:22:08.891304 4931 generic.go:334] "Generic (PLEG): container finished" podID="306db296-7d68-4bb8-b723-782c6671f98c" containerID="c3357a942af2ac1cab5eca60ca012c3756f0a73d2b38703faf0fd38f68e1c6e1" exitCode=0 Dec 01 15:22:08 crc kubenswrapper[4931]: I1201 15:22:08.891495 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerDied","Data":"c3357a942af2ac1cab5eca60ca012c3756f0a73d2b38703faf0fd38f68e1c6e1"} Dec 01 15:22:09 crc kubenswrapper[4931]: I1201 15:22:09.903922 4931 generic.go:334] "Generic (PLEG): container finished" podID="2e39a0af-5976-435f-b5c5-6e1e820ea761" containerID="6f6c2ab96e502e1e1082e5c5fc79b7f1a5c8684fdd852436e087527d8efec353" exitCode=0 Dec 01 15:22:09 crc kubenswrapper[4931]: I1201 15:22:09.904004 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lbnzw" event={"ID":"2e39a0af-5976-435f-b5c5-6e1e820ea761","Type":"ContainerDied","Data":"6f6c2ab96e502e1e1082e5c5fc79b7f1a5c8684fdd852436e087527d8efec353"} Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.276968 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.387680 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-config-data\") pod \"2e39a0af-5976-435f-b5c5-6e1e820ea761\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.387756 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv4bk\" (UniqueName: \"kubernetes.io/projected/2e39a0af-5976-435f-b5c5-6e1e820ea761-kube-api-access-xv4bk\") pod \"2e39a0af-5976-435f-b5c5-6e1e820ea761\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.387783 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-combined-ca-bundle\") pod \"2e39a0af-5976-435f-b5c5-6e1e820ea761\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.387854 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-scripts\") pod \"2e39a0af-5976-435f-b5c5-6e1e820ea761\" (UID: \"2e39a0af-5976-435f-b5c5-6e1e820ea761\") " Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.393823 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-scripts" (OuterVolumeSpecName: "scripts") pod "2e39a0af-5976-435f-b5c5-6e1e820ea761" (UID: "2e39a0af-5976-435f-b5c5-6e1e820ea761"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.394186 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e39a0af-5976-435f-b5c5-6e1e820ea761-kube-api-access-xv4bk" (OuterVolumeSpecName: "kube-api-access-xv4bk") pod "2e39a0af-5976-435f-b5c5-6e1e820ea761" (UID: "2e39a0af-5976-435f-b5c5-6e1e820ea761"). InnerVolumeSpecName "kube-api-access-xv4bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.416815 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-config-data" (OuterVolumeSpecName: "config-data") pod "2e39a0af-5976-435f-b5c5-6e1e820ea761" (UID: "2e39a0af-5976-435f-b5c5-6e1e820ea761"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.423614 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e39a0af-5976-435f-b5c5-6e1e820ea761" (UID: "2e39a0af-5976-435f-b5c5-6e1e820ea761"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.489762 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.489802 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv4bk\" (UniqueName: \"kubernetes.io/projected/2e39a0af-5976-435f-b5c5-6e1e820ea761-kube-api-access-xv4bk\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.489815 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.489827 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e39a0af-5976-435f-b5c5-6e1e820ea761-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.926591 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lbnzw" event={"ID":"2e39a0af-5976-435f-b5c5-6e1e820ea761","Type":"ContainerDied","Data":"087ae801b104491010bd2e430abc3b8dc237bddc21e4aaca9ffc0fadf44ecf77"} Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.926915 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="087ae801b104491010bd2e430abc3b8dc237bddc21e4aaca9ffc0fadf44ecf77" Dec 01 15:22:11 crc kubenswrapper[4931]: I1201 15:22:11.926656 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lbnzw" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.016375 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 15:22:12 crc kubenswrapper[4931]: E1201 15:22:12.016943 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e39a0af-5976-435f-b5c5-6e1e820ea761" containerName="nova-cell0-conductor-db-sync" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.016972 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e39a0af-5976-435f-b5c5-6e1e820ea761" containerName="nova-cell0-conductor-db-sync" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.017208 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e39a0af-5976-435f-b5c5-6e1e820ea761" containerName="nova-cell0-conductor-db-sync" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.018040 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.020646 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wxvrz" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.024645 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.032957 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.101001 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c084c37d-132d-466d-94d5-8176928d467e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c084c37d-132d-466d-94d5-8176928d467e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: 
I1201 15:22:12.101226 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm5nq\" (UniqueName: \"kubernetes.io/projected/c084c37d-132d-466d-94d5-8176928d467e-kube-api-access-nm5nq\") pod \"nova-cell0-conductor-0\" (UID: \"c084c37d-132d-466d-94d5-8176928d467e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.101284 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c084c37d-132d-466d-94d5-8176928d467e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c084c37d-132d-466d-94d5-8176928d467e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.203464 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c084c37d-132d-466d-94d5-8176928d467e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c084c37d-132d-466d-94d5-8176928d467e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.203604 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm5nq\" (UniqueName: \"kubernetes.io/projected/c084c37d-132d-466d-94d5-8176928d467e-kube-api-access-nm5nq\") pod \"nova-cell0-conductor-0\" (UID: \"c084c37d-132d-466d-94d5-8176928d467e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.203739 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c084c37d-132d-466d-94d5-8176928d467e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c084c37d-132d-466d-94d5-8176928d467e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.210193 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c084c37d-132d-466d-94d5-8176928d467e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c084c37d-132d-466d-94d5-8176928d467e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.215109 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c084c37d-132d-466d-94d5-8176928d467e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c084c37d-132d-466d-94d5-8176928d467e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.223740 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm5nq\" (UniqueName: \"kubernetes.io/projected/c084c37d-132d-466d-94d5-8176928d467e-kube-api-access-nm5nq\") pod \"nova-cell0-conductor-0\" (UID: \"c084c37d-132d-466d-94d5-8176928d467e\") " pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.344045 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.804805 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 15:22:12 crc kubenswrapper[4931]: I1201 15:22:12.938976 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c084c37d-132d-466d-94d5-8176928d467e","Type":"ContainerStarted","Data":"8703f2326cbf01be02d8e5252704955070ca3d2e25e57f8d63c7dbdcdf9bb0f6"} Dec 01 15:22:13 crc kubenswrapper[4931]: I1201 15:22:13.947572 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c084c37d-132d-466d-94d5-8176928d467e","Type":"ContainerStarted","Data":"7e01ddeb924559111c710c6a2699b8893d392f9a0bcb6aa552f1c15980996e2c"} Dec 01 15:22:13 crc kubenswrapper[4931]: I1201 15:22:13.947999 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:13 crc kubenswrapper[4931]: I1201 15:22:13.971438 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.971418678 podStartE2EDuration="2.971418678s" podCreationTimestamp="2025-12-01 15:22:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:13.964428112 +0000 UTC m=+1280.390301779" watchObservedRunningTime="2025-12-01 15:22:13.971418678 +0000 UTC m=+1280.397292355" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.370780 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.841671 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ff6b6"] Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.842964 4931 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.845870 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.850435 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.862476 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ff6b6"] Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.899349 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-scripts\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.899517 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.899552 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-config-data\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.899653 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p8lbg\" (UniqueName: \"kubernetes.io/projected/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-kube-api-access-p8lbg\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.988204 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.989889 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:22:17 crc kubenswrapper[4931]: I1201 15:22:17.993031 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.001399 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8lbg\" (UniqueName: \"kubernetes.io/projected/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-kube-api-access-p8lbg\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.001707 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-scripts\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.001740 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-config-data\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.001766 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89e95fae-7af8-41ee-b86d-97804a008d69-logs\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.001784 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.001815 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qwj8\" (UniqueName: \"kubernetes.io/projected/89e95fae-7af8-41ee-b86d-97804a008d69-kube-api-access-4qwj8\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.001846 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.001866 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-config-data\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.010053 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-config-data\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.020055 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.020761 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-scripts\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.020831 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.034066 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8lbg\" (UniqueName: \"kubernetes.io/projected/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-kube-api-access-p8lbg\") pod \"nova-cell0-cell-mapping-ff6b6\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.089158 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.091795 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.094811 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.104206 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.104523 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89e95fae-7af8-41ee-b86d-97804a008d69-logs\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.104578 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.104621 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dm8\" (UniqueName: \"kubernetes.io/projected/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-kube-api-access-t6dm8\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.104667 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qwj8\" (UniqueName: \"kubernetes.io/projected/89e95fae-7af8-41ee-b86d-97804a008d69-kube-api-access-4qwj8\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.104764 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.104822 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.104896 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-config-data\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.106767 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89e95fae-7af8-41ee-b86d-97804a008d69-logs\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.112076 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-config-data\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.149233 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qwj8\" (UniqueName: \"kubernetes.io/projected/89e95fae-7af8-41ee-b86d-97804a008d69-kube-api-access-4qwj8\") pod \"nova-api-0\" (UID: 
\"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.150245 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") " pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.181634 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.208007 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.208082 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.208170 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dm8\" (UniqueName: \"kubernetes.io/projected/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-kube-api-access-t6dm8\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.217093 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.225096 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.241025 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dm8\" (UniqueName: \"kubernetes.io/projected/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-kube-api-access-t6dm8\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.272198 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.279208 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.284428 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.295248 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.312878 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") " pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.313018 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-config-data\") pod \"nova-scheduler-0\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") " pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.313131 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx6vk\" (UniqueName: \"kubernetes.io/projected/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-kube-api-access-mx6vk\") pod \"nova-scheduler-0\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") " pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.319880 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.324579 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.328199 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.366410 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.408551 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.424518 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-config-data\") pod \"nova-scheduler-0\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") " pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.427706 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx6vk\" (UniqueName: \"kubernetes.io/projected/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-kube-api-access-mx6vk\") pod \"nova-scheduler-0\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") " pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.427779 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4w75\" (UniqueName: \"kubernetes.io/projected/242f6f8c-594a-47ef-b53a-7c550acaabac-kube-api-access-f4w75\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.427855 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242f6f8c-594a-47ef-b53a-7c550acaabac-logs\") pod \"nova-metadata-0\" (UID: 
\"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.427902 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-config-data\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.427940 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") " pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.427960 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.429464 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-9z8sn"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.432540 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.435697 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-9z8sn"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.443371 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") " pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.445466 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-config-data\") pod \"nova-scheduler-0\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") " pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.451540 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx6vk\" (UniqueName: \"kubernetes.io/projected/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-kube-api-access-mx6vk\") pod \"nova-scheduler-0\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") " pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.488734 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532145 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-config\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532189 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532274 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4w75\" (UniqueName: \"kubernetes.io/projected/242f6f8c-594a-47ef-b53a-7c550acaabac-kube-api-access-f4w75\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532331 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532366 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: 
\"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532401 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242f6f8c-594a-47ef-b53a-7c550acaabac-logs\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532419 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-svc\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532464 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-config-data\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532491 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.532633 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9kr8\" (UniqueName: \"kubernetes.io/projected/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-kube-api-access-p9kr8\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc 
kubenswrapper[4931]: I1201 15:22:18.534209 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242f6f8c-594a-47ef-b53a-7c550acaabac-logs\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.539735 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.551882 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-config-data\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.555801 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4w75\" (UniqueName: \"kubernetes.io/projected/242f6f8c-594a-47ef-b53a-7c550acaabac-kube-api-access-f4w75\") pod \"nova-metadata-0\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.634137 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9kr8\" (UniqueName: \"kubernetes.io/projected/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-kube-api-access-p9kr8\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.634821 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-config\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.634842 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.634887 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.634909 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.634955 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-svc\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.635966 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-config\") pod 
\"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.637290 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.637298 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.638759 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.640425 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.640977 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-svc\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.652551 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.674017 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9kr8\" (UniqueName: \"kubernetes.io/projected/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-kube-api-access-p9kr8\") pod \"dnsmasq-dns-865f5d856f-9z8sn\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.755886 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.815822 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ff6b6"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.952666 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.991975 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"89e95fae-7af8-41ee-b86d-97804a008d69","Type":"ContainerStarted","Data":"a466fb481f6a47761dcdcf9b976c1682d632369f56a7fbef6f16d1ee469c9ffc"} Dec 01 15:22:18 crc kubenswrapper[4931]: I1201 15:22:18.992815 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ff6b6" event={"ID":"2d7a79d2-c2cd-4b85-a307-dc7a40d55861","Type":"ContainerStarted","Data":"95d8005d242ff83f3efa03ffefbf16284b0794390fd64379e522f1b1bd0d37a0"} Dec 01 15:22:19 crc kubenswrapper[4931]: W1201 15:22:19.033153 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d4321f4_2e0f_444e_8e4f_c1ae11c0602c.slice/crio-4bb97e87f89508323003c777d8ca848ff160d0492abe5dc6f37c7ebf1c347aaf WatchSource:0}: Error finding container 4bb97e87f89508323003c777d8ca848ff160d0492abe5dc6f37c7ebf1c347aaf: Status 404 returned 
error can't find the container with id 4bb97e87f89508323003c777d8ca848ff160d0492abe5dc6f37c7ebf1c347aaf Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.044597 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.107432 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l9gbm"] Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.108656 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.111745 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.116755 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l9gbm"] Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.118181 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.151417 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-scripts\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.151635 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc 
kubenswrapper[4931]: I1201 15:22:19.151747 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-config-data\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.151866 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk2dn\" (UniqueName: \"kubernetes.io/projected/291244d6-d533-45e1-856d-c12e69935ca7-kube-api-access-zk2dn\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.248846 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.253177 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk2dn\" (UniqueName: \"kubernetes.io/projected/291244d6-d533-45e1-856d-c12e69935ca7-kube-api-access-zk2dn\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.253365 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-scripts\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.253470 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.253588 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-config-data\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.260003 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.260015 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-config-data\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.263342 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-scripts\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.268737 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk2dn\" (UniqueName: 
\"kubernetes.io/projected/291244d6-d533-45e1-856d-c12e69935ca7-kube-api-access-zk2dn\") pod \"nova-cell1-conductor-db-sync-l9gbm\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.372870 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.445145 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-9z8sn"] Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.463840 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:19 crc kubenswrapper[4931]: I1201 15:22:19.947261 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l9gbm"] Dec 01 15:22:19 crc kubenswrapper[4931]: W1201 15:22:19.957492 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod291244d6_d533_45e1_856d_c12e69935ca7.slice/crio-6e53df1b8400e7702921213100bcc548c9242d4456c544942179ce0242988aca WatchSource:0}: Error finding container 6e53df1b8400e7702921213100bcc548c9242d4456c544942179ce0242988aca: Status 404 returned error can't find the container with id 6e53df1b8400e7702921213100bcc548c9242d4456c544942179ce0242988aca Dec 01 15:22:20 crc kubenswrapper[4931]: I1201 15:22:20.014451 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ff6b6" event={"ID":"2d7a79d2-c2cd-4b85-a307-dc7a40d55861","Type":"ContainerStarted","Data":"39e62703271991718440019903e638e081e4adde592492a519255a8d8170bb5a"} Dec 01 15:22:20 crc kubenswrapper[4931]: I1201 15:22:20.021135 4931 generic.go:334] "Generic (PLEG): container finished" podID="4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" 
containerID="99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280" exitCode=0 Dec 01 15:22:20 crc kubenswrapper[4931]: I1201 15:22:20.021203 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" event={"ID":"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc","Type":"ContainerDied","Data":"99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280"} Dec 01 15:22:20 crc kubenswrapper[4931]: I1201 15:22:20.021230 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" event={"ID":"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc","Type":"ContainerStarted","Data":"8eff5af54e133607574c21a63733a724247ec19d862a5059c2c800363bd3d3af"} Dec 01 15:22:20 crc kubenswrapper[4931]: I1201 15:22:20.026777 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c","Type":"ContainerStarted","Data":"4bb97e87f89508323003c777d8ca848ff160d0492abe5dc6f37c7ebf1c347aaf"} Dec 01 15:22:20 crc kubenswrapper[4931]: I1201 15:22:20.031302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"242f6f8c-594a-47ef-b53a-7c550acaabac","Type":"ContainerStarted","Data":"6749cdd50fcdb6de4052b8de0f75fd9cfeb060124920f01b0086dce3aae1e9a9"} Dec 01 15:22:20 crc kubenswrapper[4931]: I1201 15:22:20.033768 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l9gbm" event={"ID":"291244d6-d533-45e1-856d-c12e69935ca7","Type":"ContainerStarted","Data":"6e53df1b8400e7702921213100bcc548c9242d4456c544942179ce0242988aca"} Dec 01 15:22:20 crc kubenswrapper[4931]: I1201 15:22:20.038338 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f","Type":"ContainerStarted","Data":"c32f54206417dea8474527c7bf86a59574a567c5b708fb7e1fb4fc6254df307e"} Dec 01 15:22:20 crc kubenswrapper[4931]: I1201 
15:22:20.060354 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ff6b6" podStartSLOduration=3.060331596 podStartE2EDuration="3.060331596s" podCreationTimestamp="2025-12-01 15:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:20.041539579 +0000 UTC m=+1286.467413246" watchObservedRunningTime="2025-12-01 15:22:20.060331596 +0000 UTC m=+1286.486205273" Dec 01 15:22:21 crc kubenswrapper[4931]: I1201 15:22:21.056745 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" event={"ID":"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc","Type":"ContainerStarted","Data":"6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15"} Dec 01 15:22:21 crc kubenswrapper[4931]: I1201 15:22:21.058267 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:21 crc kubenswrapper[4931]: I1201 15:22:21.060736 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l9gbm" event={"ID":"291244d6-d533-45e1-856d-c12e69935ca7","Type":"ContainerStarted","Data":"6430c816fc522c0854d435f13fb666786548ec37385c0ed408d4badbc7d87a50"} Dec 01 15:22:21 crc kubenswrapper[4931]: I1201 15:22:21.087945 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" podStartSLOduration=3.087926501 podStartE2EDuration="3.087926501s" podCreationTimestamp="2025-12-01 15:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:21.080258716 +0000 UTC m=+1287.506132383" watchObservedRunningTime="2025-12-01 15:22:21.087926501 +0000 UTC m=+1287.513800168" Dec 01 15:22:21 crc kubenswrapper[4931]: I1201 15:22:21.100912 4931 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-l9gbm" podStartSLOduration=2.100895934 podStartE2EDuration="2.100895934s" podCreationTimestamp="2025-12-01 15:22:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:21.095213765 +0000 UTC m=+1287.521087432" watchObservedRunningTime="2025-12-01 15:22:21.100895934 +0000 UTC m=+1287.526769601" Dec 01 15:22:21 crc kubenswrapper[4931]: I1201 15:22:21.777514 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:21 crc kubenswrapper[4931]: I1201 15:22:21.790911 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 15:22:23 crc kubenswrapper[4931]: I1201 15:22:23.082851 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f","Type":"ContainerStarted","Data":"a57b7f373a51d0ec2ab52a89d963afa70e30d6422a6294053a30a74fe72c9f04"} Dec 01 15:22:23 crc kubenswrapper[4931]: I1201 15:22:23.084827 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c","Type":"ContainerStarted","Data":"d3f93fc64e9553ec77673799f6285c6fbd88af0c65638bb2fb10982cee29ea3b"} Dec 01 15:22:23 crc kubenswrapper[4931]: I1201 15:22:23.084873 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6d4321f4-2e0f-444e-8e4f-c1ae11c0602c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d3f93fc64e9553ec77673799f6285c6fbd88af0c65638bb2fb10982cee29ea3b" gracePeriod=30 Dec 01 15:22:23 crc kubenswrapper[4931]: I1201 15:22:23.089031 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"89e95fae-7af8-41ee-b86d-97804a008d69","Type":"ContainerStarted","Data":"3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4"} Dec 01 15:22:23 crc kubenswrapper[4931]: I1201 15:22:23.090529 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"242f6f8c-594a-47ef-b53a-7c550acaabac","Type":"ContainerStarted","Data":"a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8"} Dec 01 15:22:23 crc kubenswrapper[4931]: I1201 15:22:23.101800 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.648515008 podStartE2EDuration="5.10178564s" podCreationTimestamp="2025-12-01 15:22:18 +0000 UTC" firstStartedPulling="2025-12-01 15:22:19.236005051 +0000 UTC m=+1285.661878718" lastFinishedPulling="2025-12-01 15:22:22.689275663 +0000 UTC m=+1289.115149350" observedRunningTime="2025-12-01 15:22:23.100811253 +0000 UTC m=+1289.526684920" watchObservedRunningTime="2025-12-01 15:22:23.10178564 +0000 UTC m=+1289.527659307" Dec 01 15:22:23 crc kubenswrapper[4931]: I1201 15:22:23.126795 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.474588971 podStartE2EDuration="5.126775241s" podCreationTimestamp="2025-12-01 15:22:18 +0000 UTC" firstStartedPulling="2025-12-01 15:22:19.035078217 +0000 UTC m=+1285.460951884" lastFinishedPulling="2025-12-01 15:22:22.687264477 +0000 UTC m=+1289.113138154" observedRunningTime="2025-12-01 15:22:23.11851816 +0000 UTC m=+1289.544391827" watchObservedRunningTime="2025-12-01 15:22:23.126775241 +0000 UTC m=+1289.552648928" Dec 01 15:22:23 crc kubenswrapper[4931]: I1201 15:22:23.403637 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 15:22:23 crc 
kubenswrapper[4931]: I1201 15:22:23.490024 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:23 crc kubenswrapper[4931]: I1201 15:22:23.639594 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.105751 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"89e95fae-7af8-41ee-b86d-97804a008d69","Type":"ContainerStarted","Data":"9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01"} Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.109285 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="242f6f8c-594a-47ef-b53a-7c550acaabac" containerName="nova-metadata-log" containerID="cri-o://a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8" gracePeriod=30 Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.109341 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="242f6f8c-594a-47ef-b53a-7c550acaabac" containerName="nova-metadata-metadata" containerID="cri-o://f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b" gracePeriod=30 Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.109312 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"242f6f8c-594a-47ef-b53a-7c550acaabac","Type":"ContainerStarted","Data":"f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b"} Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.140584 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.435119655 podStartE2EDuration="7.140553999s" podCreationTimestamp="2025-12-01 15:22:17 +0000 UTC" firstStartedPulling="2025-12-01 15:22:18.983978033 +0000 UTC m=+1285.409851700" 
lastFinishedPulling="2025-12-01 15:22:22.689412377 +0000 UTC m=+1289.115286044" observedRunningTime="2025-12-01 15:22:24.134159169 +0000 UTC m=+1290.560032856" watchObservedRunningTime="2025-12-01 15:22:24.140553999 +0000 UTC m=+1290.566427696" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.166319 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.823759023 podStartE2EDuration="6.16629554s" podCreationTimestamp="2025-12-01 15:22:18 +0000 UTC" firstStartedPulling="2025-12-01 15:22:19.350297116 +0000 UTC m=+1285.776170773" lastFinishedPulling="2025-12-01 15:22:22.692833613 +0000 UTC m=+1289.118707290" observedRunningTime="2025-12-01 15:22:24.154223882 +0000 UTC m=+1290.580097559" watchObservedRunningTime="2025-12-01 15:22:24.16629554 +0000 UTC m=+1290.592169227" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.642974 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.690632 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4w75\" (UniqueName: \"kubernetes.io/projected/242f6f8c-594a-47ef-b53a-7c550acaabac-kube-api-access-f4w75\") pod \"242f6f8c-594a-47ef-b53a-7c550acaabac\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.690757 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242f6f8c-594a-47ef-b53a-7c550acaabac-logs\") pod \"242f6f8c-594a-47ef-b53a-7c550acaabac\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.690795 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-combined-ca-bundle\") 
pod \"242f6f8c-594a-47ef-b53a-7c550acaabac\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.690851 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-config-data\") pod \"242f6f8c-594a-47ef-b53a-7c550acaabac\" (UID: \"242f6f8c-594a-47ef-b53a-7c550acaabac\") " Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.692889 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/242f6f8c-594a-47ef-b53a-7c550acaabac-logs" (OuterVolumeSpecName: "logs") pod "242f6f8c-594a-47ef-b53a-7c550acaabac" (UID: "242f6f8c-594a-47ef-b53a-7c550acaabac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.698179 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242f6f8c-594a-47ef-b53a-7c550acaabac-kube-api-access-f4w75" (OuterVolumeSpecName: "kube-api-access-f4w75") pod "242f6f8c-594a-47ef-b53a-7c550acaabac" (UID: "242f6f8c-594a-47ef-b53a-7c550acaabac"). InnerVolumeSpecName "kube-api-access-f4w75". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.728289 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-config-data" (OuterVolumeSpecName: "config-data") pod "242f6f8c-594a-47ef-b53a-7c550acaabac" (UID: "242f6f8c-594a-47ef-b53a-7c550acaabac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.731514 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "242f6f8c-594a-47ef-b53a-7c550acaabac" (UID: "242f6f8c-594a-47ef-b53a-7c550acaabac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.793665 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.793708 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4w75\" (UniqueName: \"kubernetes.io/projected/242f6f8c-594a-47ef-b53a-7c550acaabac-kube-api-access-f4w75\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.793723 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242f6f8c-594a-47ef-b53a-7c550acaabac-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:24 crc kubenswrapper[4931]: I1201 15:22:24.793732 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242f6f8c-594a-47ef-b53a-7c550acaabac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.120182 4931 generic.go:334] "Generic (PLEG): container finished" podID="242f6f8c-594a-47ef-b53a-7c550acaabac" containerID="f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b" exitCode=0 Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.120219 4931 generic.go:334] "Generic (PLEG): container finished" podID="242f6f8c-594a-47ef-b53a-7c550acaabac" 
containerID="a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8" exitCode=143 Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.120225 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.120274 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"242f6f8c-594a-47ef-b53a-7c550acaabac","Type":"ContainerDied","Data":"f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b"} Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.120364 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"242f6f8c-594a-47ef-b53a-7c550acaabac","Type":"ContainerDied","Data":"a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8"} Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.120396 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"242f6f8c-594a-47ef-b53a-7c550acaabac","Type":"ContainerDied","Data":"6749cdd50fcdb6de4052b8de0f75fd9cfeb060124920f01b0086dce3aae1e9a9"} Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.120417 4931 scope.go:117] "RemoveContainer" containerID="f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.146304 4931 scope.go:117] "RemoveContainer" containerID="a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.188186 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.197687 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.202930 4931 scope.go:117] "RemoveContainer" 
containerID="f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b" Dec 01 15:22:25 crc kubenswrapper[4931]: E1201 15:22:25.203345 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b\": container with ID starting with f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b not found: ID does not exist" containerID="f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.203505 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b"} err="failed to get container status \"f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b\": rpc error: code = NotFound desc = could not find container \"f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b\": container with ID starting with f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b not found: ID does not exist" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.203545 4931 scope.go:117] "RemoveContainer" containerID="a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8" Dec 01 15:22:25 crc kubenswrapper[4931]: E1201 15:22:25.203982 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8\": container with ID starting with a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8 not found: ID does not exist" containerID="a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.204026 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8"} err="failed to get container status \"a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8\": rpc error: code = NotFound desc = could not find container \"a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8\": container with ID starting with a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8 not found: ID does not exist" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.204051 4931 scope.go:117] "RemoveContainer" containerID="f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.204366 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b"} err="failed to get container status \"f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b\": rpc error: code = NotFound desc = could not find container \"f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b\": container with ID starting with f929a7654b10d090fa1715bfd16c275c2e162ee65a20588971fb75e3aa26dc7b not found: ID does not exist" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.204424 4931 scope.go:117] "RemoveContainer" containerID="a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.204723 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8"} err="failed to get container status \"a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8\": rpc error: code = NotFound desc = could not find container \"a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8\": container with ID starting with a8970c6cec537c9e4f935735eebeae038866015c53d90ae10e909a3091b7bcf8 not found: ID does not 
exist" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.208847 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:25 crc kubenswrapper[4931]: E1201 15:22:25.209206 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242f6f8c-594a-47ef-b53a-7c550acaabac" containerName="nova-metadata-log" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.209226 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="242f6f8c-594a-47ef-b53a-7c550acaabac" containerName="nova-metadata-log" Dec 01 15:22:25 crc kubenswrapper[4931]: E1201 15:22:25.209247 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242f6f8c-594a-47ef-b53a-7c550acaabac" containerName="nova-metadata-metadata" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.209255 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="242f6f8c-594a-47ef-b53a-7c550acaabac" containerName="nova-metadata-metadata" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.209449 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="242f6f8c-594a-47ef-b53a-7c550acaabac" containerName="nova-metadata-log" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.209467 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="242f6f8c-594a-47ef-b53a-7c550acaabac" containerName="nova-metadata-metadata" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.210378 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.213289 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.213634 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.219675 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.300516 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-config-data\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.300597 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.300801 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.300843 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6654\" (UniqueName: 
\"kubernetes.io/projected/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-kube-api-access-t6654\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.300876 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-logs\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.402090 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.402135 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6654\" (UniqueName: \"kubernetes.io/projected/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-kube-api-access-t6654\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.402163 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-logs\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.402213 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-config-data\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " 
pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.402245 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.403251 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-logs\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.405812 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.406979 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.417899 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-config-data\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.421791 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6654\" (UniqueName: 
\"kubernetes.io/projected/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-kube-api-access-t6654\") pod \"nova-metadata-0\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.544249 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:25 crc kubenswrapper[4931]: I1201 15:22:25.983841 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:25 crc kubenswrapper[4931]: W1201 15:22:25.985966 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3652a998_1993_4790_8ccc_b4aa9cbaa5d9.slice/crio-8c73e79b1bb0700bb48a15f2a5ad64556df2252843f77cec919da644871a2f9b WatchSource:0}: Error finding container 8c73e79b1bb0700bb48a15f2a5ad64556df2252843f77cec919da644871a2f9b: Status 404 returned error can't find the container with id 8c73e79b1bb0700bb48a15f2a5ad64556df2252843f77cec919da644871a2f9b Dec 01 15:22:26 crc kubenswrapper[4931]: I1201 15:22:26.132055 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3652a998-1993-4790-8ccc-b4aa9cbaa5d9","Type":"ContainerStarted","Data":"8c73e79b1bb0700bb48a15f2a5ad64556df2252843f77cec919da644871a2f9b"} Dec 01 15:22:26 crc kubenswrapper[4931]: I1201 15:22:26.256125 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="242f6f8c-594a-47ef-b53a-7c550acaabac" path="/var/lib/kubelet/pods/242f6f8c-594a-47ef-b53a-7c550acaabac/volumes" Dec 01 15:22:27 crc kubenswrapper[4931]: I1201 15:22:27.142523 4931 generic.go:334] "Generic (PLEG): container finished" podID="2d7a79d2-c2cd-4b85-a307-dc7a40d55861" containerID="39e62703271991718440019903e638e081e4adde592492a519255a8d8170bb5a" exitCode=0 Dec 01 15:22:27 crc kubenswrapper[4931]: I1201 15:22:27.142592 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-ff6b6" event={"ID":"2d7a79d2-c2cd-4b85-a307-dc7a40d55861","Type":"ContainerDied","Data":"39e62703271991718440019903e638e081e4adde592492a519255a8d8170bb5a"} Dec 01 15:22:27 crc kubenswrapper[4931]: I1201 15:22:27.145444 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3652a998-1993-4790-8ccc-b4aa9cbaa5d9","Type":"ContainerStarted","Data":"42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc"} Dec 01 15:22:27 crc kubenswrapper[4931]: I1201 15:22:27.145476 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3652a998-1993-4790-8ccc-b4aa9cbaa5d9","Type":"ContainerStarted","Data":"d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e"} Dec 01 15:22:27 crc kubenswrapper[4931]: I1201 15:22:27.188078 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.188055443 podStartE2EDuration="2.188055443s" podCreationTimestamp="2025-12-01 15:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:27.183464784 +0000 UTC m=+1293.609338471" watchObservedRunningTime="2025-12-01 15:22:27.188055443 +0000 UTC m=+1293.613929110" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.154171 4931 generic.go:334] "Generic (PLEG): container finished" podID="291244d6-d533-45e1-856d-c12e69935ca7" containerID="6430c816fc522c0854d435f13fb666786548ec37385c0ed408d4badbc7d87a50" exitCode=0 Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.154264 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l9gbm" event={"ID":"291244d6-d533-45e1-856d-c12e69935ca7","Type":"ContainerDied","Data":"6430c816fc522c0854d435f13fb666786548ec37385c0ed408d4badbc7d87a50"} Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.409279 4931 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.409656 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.521711 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.639587 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.660555 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-combined-ca-bundle\") pod \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.660629 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8lbg\" (UniqueName: \"kubernetes.io/projected/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-kube-api-access-p8lbg\") pod \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.660761 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-scripts\") pod \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\" (UID: \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.660807 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-config-data\") pod \"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\" (UID: 
\"2d7a79d2-c2cd-4b85-a307-dc7a40d55861\") " Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.667677 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-scripts" (OuterVolumeSpecName: "scripts") pod "2d7a79d2-c2cd-4b85-a307-dc7a40d55861" (UID: "2d7a79d2-c2cd-4b85-a307-dc7a40d55861"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.668563 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-kube-api-access-p8lbg" (OuterVolumeSpecName: "kube-api-access-p8lbg") pod "2d7a79d2-c2cd-4b85-a307-dc7a40d55861" (UID: "2d7a79d2-c2cd-4b85-a307-dc7a40d55861"). InnerVolumeSpecName "kube-api-access-p8lbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.670367 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.698916 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-config-data" (OuterVolumeSpecName: "config-data") pod "2d7a79d2-c2cd-4b85-a307-dc7a40d55861" (UID: "2d7a79d2-c2cd-4b85-a307-dc7a40d55861"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.734882 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d7a79d2-c2cd-4b85-a307-dc7a40d55861" (UID: "2d7a79d2-c2cd-4b85-a307-dc7a40d55861"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.758504 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.767000 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.767050 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8lbg\" (UniqueName: \"kubernetes.io/projected/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-kube-api-access-p8lbg\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.767070 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.767087 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d7a79d2-c2cd-4b85-a307-dc7a40d55861-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.855600 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-6h4cs"] Dec 01 15:22:28 crc kubenswrapper[4931]: I1201 15:22:28.856483 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" podUID="17ca6a4a-52ed-4de1-8265-8095615b7887" containerName="dnsmasq-dns" containerID="cri-o://b2649c121ccc78a4cee5705404f61c30d44525dcdc9de0e0a6516b4d17477a75" gracePeriod=10 Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.190857 4931 generic.go:334] "Generic (PLEG): container finished" podID="17ca6a4a-52ed-4de1-8265-8095615b7887" 
containerID="b2649c121ccc78a4cee5705404f61c30d44525dcdc9de0e0a6516b4d17477a75" exitCode=0 Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.190888 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" event={"ID":"17ca6a4a-52ed-4de1-8265-8095615b7887","Type":"ContainerDied","Data":"b2649c121ccc78a4cee5705404f61c30d44525dcdc9de0e0a6516b4d17477a75"} Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.203941 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ff6b6" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.209728 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ff6b6" event={"ID":"2d7a79d2-c2cd-4b85-a307-dc7a40d55861","Type":"ContainerDied","Data":"95d8005d242ff83f3efa03ffefbf16284b0794390fd64379e522f1b1bd0d37a0"} Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.209768 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95d8005d242ff83f3efa03ffefbf16284b0794390fd64379e522f1b1bd0d37a0" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.245230 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.307126 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.380667 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-svc\") pod \"17ca6a4a-52ed-4de1-8265-8095615b7887\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.380751 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-nb\") pod \"17ca6a4a-52ed-4de1-8265-8095615b7887\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.380801 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-swift-storage-0\") pod \"17ca6a4a-52ed-4de1-8265-8095615b7887\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.380843 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8lrl\" (UniqueName: \"kubernetes.io/projected/17ca6a4a-52ed-4de1-8265-8095615b7887-kube-api-access-d8lrl\") pod \"17ca6a4a-52ed-4de1-8265-8095615b7887\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.380878 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-sb\") pod \"17ca6a4a-52ed-4de1-8265-8095615b7887\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.380899 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-config\") pod \"17ca6a4a-52ed-4de1-8265-8095615b7887\" (UID: \"17ca6a4a-52ed-4de1-8265-8095615b7887\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.388728 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ca6a4a-52ed-4de1-8265-8095615b7887-kube-api-access-d8lrl" (OuterVolumeSpecName: "kube-api-access-d8lrl") pod "17ca6a4a-52ed-4de1-8265-8095615b7887" (UID: "17ca6a4a-52ed-4de1-8265-8095615b7887"). InnerVolumeSpecName "kube-api-access-d8lrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.395619 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.397830 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-log" containerID="cri-o://3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4" gracePeriod=30 Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.398292 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-api" containerID="cri-o://9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01" gracePeriod=30 Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.408165 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": EOF" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.426760 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.427106 4931 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerName="nova-metadata-log" containerID="cri-o://d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e" gracePeriod=30 Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.427904 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerName="nova-metadata-metadata" containerID="cri-o://42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc" gracePeriod=30 Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.438730 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.483041 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8lrl\" (UniqueName: \"kubernetes.io/projected/17ca6a4a-52ed-4de1-8265-8095615b7887-kube-api-access-d8lrl\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.495953 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17ca6a4a-52ed-4de1-8265-8095615b7887" (UID: "17ca6a4a-52ed-4de1-8265-8095615b7887"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.497576 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17ca6a4a-52ed-4de1-8265-8095615b7887" (UID: "17ca6a4a-52ed-4de1-8265-8095615b7887"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.499771 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "17ca6a4a-52ed-4de1-8265-8095615b7887" (UID: "17ca6a4a-52ed-4de1-8265-8095615b7887"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.523838 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17ca6a4a-52ed-4de1-8265-8095615b7887" (UID: "17ca6a4a-52ed-4de1-8265-8095615b7887"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.547863 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-config" (OuterVolumeSpecName: "config") pod "17ca6a4a-52ed-4de1-8265-8095615b7887" (UID: "17ca6a4a-52ed-4de1-8265-8095615b7887"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.584491 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.584540 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.584553 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.584564 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.584573 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17ca6a4a-52ed-4de1-8265-8095615b7887-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.764198 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.770177 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.788556 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-combined-ca-bundle\") pod \"291244d6-d533-45e1-856d-c12e69935ca7\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.821416 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "291244d6-d533-45e1-856d-c12e69935ca7" (UID: "291244d6-d533-45e1-856d-c12e69935ca7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.889507 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk2dn\" (UniqueName: \"kubernetes.io/projected/291244d6-d533-45e1-856d-c12e69935ca7-kube-api-access-zk2dn\") pod \"291244d6-d533-45e1-856d-c12e69935ca7\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.889578 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-scripts\") pod \"291244d6-d533-45e1-856d-c12e69935ca7\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.889952 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-config-data\") pod \"291244d6-d533-45e1-856d-c12e69935ca7\" (UID: \"291244d6-d533-45e1-856d-c12e69935ca7\") " Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.890953 
4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.896808 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291244d6-d533-45e1-856d-c12e69935ca7-kube-api-access-zk2dn" (OuterVolumeSpecName: "kube-api-access-zk2dn") pod "291244d6-d533-45e1-856d-c12e69935ca7" (UID: "291244d6-d533-45e1-856d-c12e69935ca7"). InnerVolumeSpecName "kube-api-access-zk2dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.899945 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-scripts" (OuterVolumeSpecName: "scripts") pod "291244d6-d533-45e1-856d-c12e69935ca7" (UID: "291244d6-d533-45e1-856d-c12e69935ca7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.930403 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-config-data" (OuterVolumeSpecName: "config-data") pod "291244d6-d533-45e1-856d-c12e69935ca7" (UID: "291244d6-d533-45e1-856d-c12e69935ca7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.992184 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.992216 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk2dn\" (UniqueName: \"kubernetes.io/projected/291244d6-d533-45e1-856d-c12e69935ca7-kube-api-access-zk2dn\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:29 crc kubenswrapper[4931]: I1201 15:22:29.992225 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/291244d6-d533-45e1-856d-c12e69935ca7-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.019784 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.195813 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-nova-metadata-tls-certs\") pod \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.195959 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-config-data\") pod \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.196002 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-logs\") pod 
\"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.196053 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6654\" (UniqueName: \"kubernetes.io/projected/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-kube-api-access-t6654\") pod \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.196075 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-combined-ca-bundle\") pod \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\" (UID: \"3652a998-1993-4790-8ccc-b4aa9cbaa5d9\") " Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.196510 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-logs" (OuterVolumeSpecName: "logs") pod "3652a998-1993-4790-8ccc-b4aa9cbaa5d9" (UID: "3652a998-1993-4790-8ccc-b4aa9cbaa5d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.206587 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-kube-api-access-t6654" (OuterVolumeSpecName: "kube-api-access-t6654") pod "3652a998-1993-4790-8ccc-b4aa9cbaa5d9" (UID: "3652a998-1993-4790-8ccc-b4aa9cbaa5d9"). InnerVolumeSpecName "kube-api-access-t6654". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.225175 4931 generic.go:334] "Generic (PLEG): container finished" podID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerID="42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc" exitCode=0 Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.225214 4931 generic.go:334] "Generic (PLEG): container finished" podID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerID="d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e" exitCode=143 Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.225261 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3652a998-1993-4790-8ccc-b4aa9cbaa5d9","Type":"ContainerDied","Data":"42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc"} Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.225291 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3652a998-1993-4790-8ccc-b4aa9cbaa5d9","Type":"ContainerDied","Data":"d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e"} Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.225304 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3652a998-1993-4790-8ccc-b4aa9cbaa5d9","Type":"ContainerDied","Data":"8c73e79b1bb0700bb48a15f2a5ad64556df2252843f77cec919da644871a2f9b"} Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.225324 4931 scope.go:117] "RemoveContainer" containerID="42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.225575 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.228657 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3652a998-1993-4790-8ccc-b4aa9cbaa5d9" (UID: "3652a998-1993-4790-8ccc-b4aa9cbaa5d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.236462 4931 generic.go:334] "Generic (PLEG): container finished" podID="89e95fae-7af8-41ee-b86d-97804a008d69" containerID="3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4" exitCode=143 Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.236537 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"89e95fae-7af8-41ee-b86d-97804a008d69","Type":"ContainerDied","Data":"3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4"} Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.240122 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l9gbm" event={"ID":"291244d6-d533-45e1-856d-c12e69935ca7","Type":"ContainerDied","Data":"6e53df1b8400e7702921213100bcc548c9242d4456c544942179ce0242988aca"} Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.240252 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e53df1b8400e7702921213100bcc548c9242d4456c544942179ce0242988aca" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.240363 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l9gbm" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.247272 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.248135 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-config-data" (OuterVolumeSpecName: "config-data") pod "3652a998-1993-4790-8ccc-b4aa9cbaa5d9" (UID: "3652a998-1993-4790-8ccc-b4aa9cbaa5d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.263520 4931 scope.go:117] "RemoveContainer" containerID="d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.274971 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-6h4cs" event={"ID":"17ca6a4a-52ed-4de1-8265-8095615b7887","Type":"ContainerDied","Data":"a0fb5a4367e410422062471ebd18e2d0c0b15c3f9c29ab5d397a63310d7fb6c8"} Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.298567 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.298596 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.298605 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6654\" (UniqueName: \"kubernetes.io/projected/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-kube-api-access-t6654\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.298616 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.308536 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3652a998-1993-4790-8ccc-b4aa9cbaa5d9" (UID: "3652a998-1993-4790-8ccc-b4aa9cbaa5d9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.324447 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 15:22:30 crc kubenswrapper[4931]: E1201 15:22:30.324862 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291244d6-d533-45e1-856d-c12e69935ca7" containerName="nova-cell1-conductor-db-sync" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.324880 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="291244d6-d533-45e1-856d-c12e69935ca7" containerName="nova-cell1-conductor-db-sync" Dec 01 15:22:30 crc kubenswrapper[4931]: E1201 15:22:30.324899 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ca6a4a-52ed-4de1-8265-8095615b7887" containerName="dnsmasq-dns" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.324905 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ca6a4a-52ed-4de1-8265-8095615b7887" containerName="dnsmasq-dns" Dec 01 15:22:30 crc kubenswrapper[4931]: E1201 15:22:30.324924 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerName="nova-metadata-log" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.324931 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerName="nova-metadata-log" Dec 01 15:22:30 crc kubenswrapper[4931]: E1201 
15:22:30.324951 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d7a79d2-c2cd-4b85-a307-dc7a40d55861" containerName="nova-manage" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.324956 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7a79d2-c2cd-4b85-a307-dc7a40d55861" containerName="nova-manage" Dec 01 15:22:30 crc kubenswrapper[4931]: E1201 15:22:30.324970 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerName="nova-metadata-metadata" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.324976 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerName="nova-metadata-metadata" Dec 01 15:22:30 crc kubenswrapper[4931]: E1201 15:22:30.324987 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ca6a4a-52ed-4de1-8265-8095615b7887" containerName="init" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.324995 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ca6a4a-52ed-4de1-8265-8095615b7887" containerName="init" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.325167 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d7a79d2-c2cd-4b85-a307-dc7a40d55861" containerName="nova-manage" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.325176 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ca6a4a-52ed-4de1-8265-8095615b7887" containerName="dnsmasq-dns" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.325195 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerName="nova-metadata-log" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.325203 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="291244d6-d533-45e1-856d-c12e69935ca7" containerName="nova-cell1-conductor-db-sync" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 
15:22:30.325216 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" containerName="nova-metadata-metadata" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.325855 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.327994 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.338064 4931 scope.go:117] "RemoveContainer" containerID="42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc" Dec 01 15:22:30 crc kubenswrapper[4931]: E1201 15:22:30.338635 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc\": container with ID starting with 42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc not found: ID does not exist" containerID="42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.338689 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc"} err="failed to get container status \"42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc\": rpc error: code = NotFound desc = could not find container \"42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc\": container with ID starting with 42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc not found: ID does not exist" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.338719 4931 scope.go:117] "RemoveContainer" containerID="d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.340651 
4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 15:22:30 crc kubenswrapper[4931]: E1201 15:22:30.344124 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e\": container with ID starting with d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e not found: ID does not exist" containerID="d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.344157 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e"} err="failed to get container status \"d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e\": rpc error: code = NotFound desc = could not find container \"d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e\": container with ID starting with d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e not found: ID does not exist" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.344184 4931 scope.go:117] "RemoveContainer" containerID="42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.344850 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc"} err="failed to get container status \"42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc\": rpc error: code = NotFound desc = could not find container \"42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc\": container with ID starting with 42a4dfe9f1e443774c3eabc363b6c49312eadc426ea82446b999b60301a90fdc not found: ID does not exist" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.344899 4931 
scope.go:117] "RemoveContainer" containerID="d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.345832 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e"} err="failed to get container status \"d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e\": rpc error: code = NotFound desc = could not find container \"d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e\": container with ID starting with d94e6a0ecb845da1c06e613bfafd8b66147c71b177d44fb6dd6755a978dbdd4e not found: ID does not exist" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.345860 4931 scope.go:117] "RemoveContainer" containerID="b2649c121ccc78a4cee5705404f61c30d44525dcdc9de0e0a6516b4d17477a75" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.352542 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-6h4cs"] Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.363803 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-6h4cs"] Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.369547 4931 scope.go:117] "RemoveContainer" containerID="f628d85b18b094f0613c32bc73bb9e82d66d9afcb95a886c8cc49d563bb9bb7b" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.401864 4931 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3652a998-1993-4790-8ccc-b4aa9cbaa5d9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.503338 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb9b8d1-45eb-45c7-af4e-64c6c020860d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"dfb9b8d1-45eb-45c7-af4e-64c6c020860d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.503439 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb9b8d1-45eb-45c7-af4e-64c6c020860d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dfb9b8d1-45eb-45c7-af4e-64c6c020860d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.503537 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q27c\" (UniqueName: \"kubernetes.io/projected/dfb9b8d1-45eb-45c7-af4e-64c6c020860d-kube-api-access-6q27c\") pod \"nova-cell1-conductor-0\" (UID: \"dfb9b8d1-45eb-45c7-af4e-64c6c020860d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.562725 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.579085 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.606849 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q27c\" (UniqueName: \"kubernetes.io/projected/dfb9b8d1-45eb-45c7-af4e-64c6c020860d-kube-api-access-6q27c\") pod \"nova-cell1-conductor-0\" (UID: \"dfb9b8d1-45eb-45c7-af4e-64c6c020860d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.607485 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb9b8d1-45eb-45c7-af4e-64c6c020860d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dfb9b8d1-45eb-45c7-af4e-64c6c020860d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: 
I1201 15:22:30.607646 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb9b8d1-45eb-45c7-af4e-64c6c020860d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dfb9b8d1-45eb-45c7-af4e-64c6c020860d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.615241 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb9b8d1-45eb-45c7-af4e-64c6c020860d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dfb9b8d1-45eb-45c7-af4e-64c6c020860d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.615296 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.618063 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.619352 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb9b8d1-45eb-45c7-af4e-64c6c020860d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dfb9b8d1-45eb-45c7-af4e-64c6c020860d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.624049 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.624055 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.632464 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q27c\" (UniqueName: \"kubernetes.io/projected/dfb9b8d1-45eb-45c7-af4e-64c6c020860d-kube-api-access-6q27c\") pod 
\"nova-cell1-conductor-0\" (UID: \"dfb9b8d1-45eb-45c7-af4e-64c6c020860d\") " pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.635812 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.654246 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.715464 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59913d10-be6d-4c01-850a-4da0139c28cf-logs\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.715839 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.715890 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-config-data\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.715990 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmt5r\" (UniqueName: \"kubernetes.io/projected/59913d10-be6d-4c01-850a-4da0139c28cf-kube-api-access-fmt5r\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc 
kubenswrapper[4931]: I1201 15:22:30.716044 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.817579 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmt5r\" (UniqueName: \"kubernetes.io/projected/59913d10-be6d-4c01-850a-4da0139c28cf-kube-api-access-fmt5r\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.817663 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.817709 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59913d10-be6d-4c01-850a-4da0139c28cf-logs\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.817733 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.817767 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-config-data\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.820164 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59913d10-be6d-4c01-850a-4da0139c28cf-logs\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.822663 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-config-data\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.824059 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.828939 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:30 crc kubenswrapper[4931]: I1201 15:22:30.849311 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmt5r\" (UniqueName: \"kubernetes.io/projected/59913d10-be6d-4c01-850a-4da0139c28cf-kube-api-access-fmt5r\") pod \"nova-metadata-0\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " pod="openstack/nova-metadata-0" Dec 01 15:22:31 crc 
kubenswrapper[4931]: I1201 15:22:31.098892 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:22:31 crc kubenswrapper[4931]: I1201 15:22:31.130488 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 15:22:31 crc kubenswrapper[4931]: W1201 15:22:31.135635 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb9b8d1_45eb_45c7_af4e_64c6c020860d.slice/crio-a599288e9b9af4709e1512abf5c285048852f99807e7ce3e72a8911af69f31c6 WatchSource:0}: Error finding container a599288e9b9af4709e1512abf5c285048852f99807e7ce3e72a8911af69f31c6: Status 404 returned error can't find the container with id a599288e9b9af4709e1512abf5c285048852f99807e7ce3e72a8911af69f31c6 Dec 01 15:22:31 crc kubenswrapper[4931]: I1201 15:22:31.278476 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" containerName="nova-scheduler-scheduler" containerID="cri-o://a57b7f373a51d0ec2ab52a89d963afa70e30d6422a6294053a30a74fe72c9f04" gracePeriod=30 Dec 01 15:22:31 crc kubenswrapper[4931]: I1201 15:22:31.278702 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dfb9b8d1-45eb-45c7-af4e-64c6c020860d","Type":"ContainerStarted","Data":"a599288e9b9af4709e1512abf5c285048852f99807e7ce3e72a8911af69f31c6"} Dec 01 15:22:31 crc kubenswrapper[4931]: I1201 15:22:31.631692 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:22:32 crc kubenswrapper[4931]: I1201 15:22:32.255575 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17ca6a4a-52ed-4de1-8265-8095615b7887" path="/var/lib/kubelet/pods/17ca6a4a-52ed-4de1-8265-8095615b7887/volumes" Dec 01 15:22:32 crc kubenswrapper[4931]: I1201 15:22:32.256804 4931 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3652a998-1993-4790-8ccc-b4aa9cbaa5d9" path="/var/lib/kubelet/pods/3652a998-1993-4790-8ccc-b4aa9cbaa5d9/volumes" Dec 01 15:22:32 crc kubenswrapper[4931]: I1201 15:22:32.287927 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dfb9b8d1-45eb-45c7-af4e-64c6c020860d","Type":"ContainerStarted","Data":"ba0ae55204c9f36b0c76722bad06b017712f0ef1c28a92f1288e89ff0d3a62bc"} Dec 01 15:22:32 crc kubenswrapper[4931]: I1201 15:22:32.289823 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:32 crc kubenswrapper[4931]: I1201 15:22:32.291046 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59913d10-be6d-4c01-850a-4da0139c28cf","Type":"ContainerStarted","Data":"4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f"} Dec 01 15:22:32 crc kubenswrapper[4931]: I1201 15:22:32.291150 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59913d10-be6d-4c01-850a-4da0139c28cf","Type":"ContainerStarted","Data":"684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6"} Dec 01 15:22:32 crc kubenswrapper[4931]: I1201 15:22:32.291229 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59913d10-be6d-4c01-850a-4da0139c28cf","Type":"ContainerStarted","Data":"0a7a2a3c85c4d41538f7fa976380f63b6ff59965cbe8a2efe03317265fbeaffc"} Dec 01 15:22:32 crc kubenswrapper[4931]: I1201 15:22:32.316417 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.316361155 podStartE2EDuration="2.316361155s" podCreationTimestamp="2025-12-01 15:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
15:22:32.304894333 +0000 UTC m=+1298.730768020" watchObservedRunningTime="2025-12-01 15:22:32.316361155 +0000 UTC m=+1298.742234832" Dec 01 15:22:32 crc kubenswrapper[4931]: I1201 15:22:32.330155 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.330128941 podStartE2EDuration="2.330128941s" podCreationTimestamp="2025-12-01 15:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:32.323806163 +0000 UTC m=+1298.749679840" watchObservedRunningTime="2025-12-01 15:22:32.330128941 +0000 UTC m=+1298.756002608" Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.305489 4931 generic.go:334] "Generic (PLEG): container finished" podID="306db296-7d68-4bb8-b723-782c6671f98c" containerID="74fc67e0951be39c0aabc699b6b5949b41f004e5496777839d2abe2cca7752c3" exitCode=137 Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.306228 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerDied","Data":"74fc67e0951be39c0aabc699b6b5949b41f004e5496777839d2abe2cca7752c3"} Dec 01 15:22:33 crc kubenswrapper[4931]: E1201 15:22:33.655296 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a57b7f373a51d0ec2ab52a89d963afa70e30d6422a6294053a30a74fe72c9f04" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 15:22:33 crc kubenswrapper[4931]: E1201 15:22:33.657998 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a57b7f373a51d0ec2ab52a89d963afa70e30d6422a6294053a30a74fe72c9f04" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 15:22:33 crc kubenswrapper[4931]: E1201 15:22:33.659262 4931 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a57b7f373a51d0ec2ab52a89d963afa70e30d6422a6294053a30a74fe72c9f04" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 15:22:33 crc kubenswrapper[4931]: E1201 15:22:33.659465 4931 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" containerName="nova-scheduler-scheduler" Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.858696 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.891009 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-combined-ca-bundle\") pod \"306db296-7d68-4bb8-b723-782c6671f98c\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.891071 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-scripts\") pod \"306db296-7d68-4bb8-b723-782c6671f98c\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.891305 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-run-httpd\") pod \"306db296-7d68-4bb8-b723-782c6671f98c\" (UID: 
\"306db296-7d68-4bb8-b723-782c6671f98c\") " Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.891349 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-log-httpd\") pod \"306db296-7d68-4bb8-b723-782c6671f98c\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.891436 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbsrx\" (UniqueName: \"kubernetes.io/projected/306db296-7d68-4bb8-b723-782c6671f98c-kube-api-access-fbsrx\") pod \"306db296-7d68-4bb8-b723-782c6671f98c\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.891461 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-config-data\") pod \"306db296-7d68-4bb8-b723-782c6671f98c\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.891538 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-sg-core-conf-yaml\") pod \"306db296-7d68-4bb8-b723-782c6671f98c\" (UID: \"306db296-7d68-4bb8-b723-782c6671f98c\") " Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.892610 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "306db296-7d68-4bb8-b723-782c6671f98c" (UID: "306db296-7d68-4bb8-b723-782c6671f98c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:22:33 crc kubenswrapper[4931]: I1201 15:22:33.893583 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "306db296-7d68-4bb8-b723-782c6671f98c" (UID: "306db296-7d68-4bb8-b723-782c6671f98c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.249563 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.249601 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/306db296-7d68-4bb8-b723-782c6671f98c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.262447 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306db296-7d68-4bb8-b723-782c6671f98c-kube-api-access-fbsrx" (OuterVolumeSpecName: "kube-api-access-fbsrx") pod "306db296-7d68-4bb8-b723-782c6671f98c" (UID: "306db296-7d68-4bb8-b723-782c6671f98c"). InnerVolumeSpecName "kube-api-access-fbsrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.263247 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-scripts" (OuterVolumeSpecName: "scripts") pod "306db296-7d68-4bb8-b723-782c6671f98c" (UID: "306db296-7d68-4bb8-b723-782c6671f98c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.278653 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "306db296-7d68-4bb8-b723-782c6671f98c" (UID: "306db296-7d68-4bb8-b723-782c6671f98c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.352514 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.352570 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.352586 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbsrx\" (UniqueName: \"kubernetes.io/projected/306db296-7d68-4bb8-b723-782c6671f98c-kube-api-access-fbsrx\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.352842 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"306db296-7d68-4bb8-b723-782c6671f98c","Type":"ContainerDied","Data":"76f375561032898dea900ed3e1bc9c92b23a43e58aca0eed6996eb1a1620dc2c"} Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.352915 4931 scope.go:117] "RemoveContainer" containerID="74fc67e0951be39c0aabc699b6b5949b41f004e5496777839d2abe2cca7752c3" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.353142 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.382375 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-config-data" (OuterVolumeSpecName: "config-data") pod "306db296-7d68-4bb8-b723-782c6671f98c" (UID: "306db296-7d68-4bb8-b723-782c6671f98c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.385817 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "306db296-7d68-4bb8-b723-782c6671f98c" (UID: "306db296-7d68-4bb8-b723-782c6671f98c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.430011 4931 scope.go:117] "RemoveContainer" containerID="1dfcf138d6cfa4355c07639aa954ef6b6506e9fcbfa6587f3e09dd6d8f8b7ab1" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.461595 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.461638 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/306db296-7d68-4bb8-b723-782c6671f98c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.464403 4931 scope.go:117] "RemoveContainer" containerID="1126984e6d5e4a64f5f10155b5a365448db25ae4b1f2c3b2031560fea82db4f2" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.485703 4931 scope.go:117] "RemoveContainer" containerID="c3357a942af2ac1cab5eca60ca012c3756f0a73d2b38703faf0fd38f68e1c6e1" 
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.686143 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.695423 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.715975 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:22:34 crc kubenswrapper[4931]: E1201 15:22:34.716361 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="sg-core" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.719452 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="sg-core" Dec 01 15:22:34 crc kubenswrapper[4931]: E1201 15:22:34.719745 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="ceilometer-central-agent" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.719828 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="ceilometer-central-agent" Dec 01 15:22:34 crc kubenswrapper[4931]: E1201 15:22:34.719943 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="proxy-httpd" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.720015 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="proxy-httpd" Dec 01 15:22:34 crc kubenswrapper[4931]: E1201 15:22:34.720091 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="ceilometer-notification-agent" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.720177 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="ceilometer-notification-agent" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.720665 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="sg-core" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.720773 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="ceilometer-notification-agent" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.720853 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="ceilometer-central-agent" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.720940 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="306db296-7d68-4bb8-b723-782c6671f98c" containerName="proxy-httpd" Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.723333 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.725811 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.726395 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.736911 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.868658 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-scripts\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.868710 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-config-data\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.868730 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-log-httpd\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.868787 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.868853 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjs2t\" (UniqueName: \"kubernetes.io/projected/25bb18d3-3005-4565-adac-25e2bdd8449c-kube-api-access-pjs2t\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.868901 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.868924 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-run-httpd\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.970937 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjs2t\" (UniqueName: \"kubernetes.io/projected/25bb18d3-3005-4565-adac-25e2bdd8449c-kube-api-access-pjs2t\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.971019 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.971063 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-run-httpd\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.971853 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-scripts\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.971875 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-run-httpd\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.971886 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-config-data\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.971958 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-log-httpd\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.972162 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.974263 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-log-httpd\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.976518 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.978274 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.978476 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-config-data\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.979353 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-scripts\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:34 crc kubenswrapper[4931]: I1201 15:22:34.990349 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjs2t\" (UniqueName: \"kubernetes.io/projected/25bb18d3-3005-4565-adac-25e2bdd8449c-kube-api-access-pjs2t\") pod \"ceilometer-0\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " pod="openstack/ceilometer-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.097411 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.233971 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.367820 4931 generic.go:334] "Generic (PLEG): container finished" podID="f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" containerID="a57b7f373a51d0ec2ab52a89d963afa70e30d6422a6294053a30a74fe72c9f04" exitCode=0
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.367914 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f","Type":"ContainerDied","Data":"a57b7f373a51d0ec2ab52a89d963afa70e30d6422a6294053a30a74fe72c9f04"}
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.370533 4931 generic.go:334] "Generic (PLEG): container finished" podID="89e95fae-7af8-41ee-b86d-97804a008d69" containerID="9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01" exitCode=0
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.370581 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"89e95fae-7af8-41ee-b86d-97804a008d69","Type":"ContainerDied","Data":"9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01"}
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.370617 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"89e95fae-7af8-41ee-b86d-97804a008d69","Type":"ContainerDied","Data":"a466fb481f6a47761dcdcf9b976c1682d632369f56a7fbef6f16d1ee469c9ffc"}
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.370649 4931 scope.go:117] "RemoveContainer" containerID="9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.370768 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.377674 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89e95fae-7af8-41ee-b86d-97804a008d69-logs\") pod \"89e95fae-7af8-41ee-b86d-97804a008d69\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") "
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.377721 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-config-data\") pod \"89e95fae-7af8-41ee-b86d-97804a008d69\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") "
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.377779 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qwj8\" (UniqueName: \"kubernetes.io/projected/89e95fae-7af8-41ee-b86d-97804a008d69-kube-api-access-4qwj8\") pod \"89e95fae-7af8-41ee-b86d-97804a008d69\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") "
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.377904 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-combined-ca-bundle\") pod \"89e95fae-7af8-41ee-b86d-97804a008d69\" (UID: \"89e95fae-7af8-41ee-b86d-97804a008d69\") "
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.378985 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89e95fae-7af8-41ee-b86d-97804a008d69-logs" (OuterVolumeSpecName: "logs") pod "89e95fae-7af8-41ee-b86d-97804a008d69" (UID: "89e95fae-7af8-41ee-b86d-97804a008d69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.382083 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e95fae-7af8-41ee-b86d-97804a008d69-kube-api-access-4qwj8" (OuterVolumeSpecName: "kube-api-access-4qwj8") pod "89e95fae-7af8-41ee-b86d-97804a008d69" (UID: "89e95fae-7af8-41ee-b86d-97804a008d69"). InnerVolumeSpecName "kube-api-access-4qwj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.394642 4931 scope.go:117] "RemoveContainer" containerID="3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.407259 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-config-data" (OuterVolumeSpecName: "config-data") pod "89e95fae-7af8-41ee-b86d-97804a008d69" (UID: "89e95fae-7af8-41ee-b86d-97804a008d69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.410677 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89e95fae-7af8-41ee-b86d-97804a008d69" (UID: "89e95fae-7af8-41ee-b86d-97804a008d69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.418336 4931 scope.go:117] "RemoveContainer" containerID="9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01"
Dec 01 15:22:35 crc kubenswrapper[4931]: E1201 15:22:35.419067 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01\": container with ID starting with 9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01 not found: ID does not exist" containerID="9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.419097 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01"} err="failed to get container status \"9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01\": rpc error: code = NotFound desc = could not find container \"9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01\": container with ID starting with 9c9e04c3d8a17e581d8c036f178b49e1d4bcd359ec6a3d3954cf9dc4d940df01 not found: ID does not exist"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.419119 4931 scope.go:117] "RemoveContainer" containerID="3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4"
Dec 01 15:22:35 crc kubenswrapper[4931]: E1201 15:22:35.419709 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4\": container with ID starting with 3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4 not found: ID does not exist" containerID="3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.419757 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4"} err="failed to get container status \"3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4\": rpc error: code = NotFound desc = could not find container \"3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4\": container with ID starting with 3aa8dfa379bb1187a226001d14ddfad653421f07bfc64066262afe1410554bc4 not found: ID does not exist"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.480116 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89e95fae-7af8-41ee-b86d-97804a008d69-logs\") on node \"crc\" DevicePath \"\""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.480147 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.480158 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qwj8\" (UniqueName: \"kubernetes.io/projected/89e95fae-7af8-41ee-b86d-97804a008d69-kube-api-access-4qwj8\") on node \"crc\" DevicePath \"\""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.480169 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89e95fae-7af8-41ee-b86d-97804a008d69-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.534625 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.535706 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: W1201 15:22:35.536748 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bb18d3_3005_4565_adac_25e2bdd8449c.slice/crio-c94277dd5de0672b6c938c456505c76c5dc92d9db04769c12be2493443e2dc73 WatchSource:0}: Error finding container c94277dd5de0672b6c938c456505c76c5dc92d9db04769c12be2493443e2dc73: Status 404 returned error can't find the container with id c94277dd5de0672b6c938c456505c76c5dc92d9db04769c12be2493443e2dc73
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.682570 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-combined-ca-bundle\") pod \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") "
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.682629 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-config-data\") pod \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") "
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.682755 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx6vk\" (UniqueName: \"kubernetes.io/projected/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-kube-api-access-mx6vk\") pod \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\" (UID: \"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f\") "
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.687219 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-kube-api-access-mx6vk" (OuterVolumeSpecName: "kube-api-access-mx6vk") pod "f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" (UID: "f33b0910-946f-4f79-a7a1-1b12f9dc9b3f"). InnerVolumeSpecName "kube-api-access-mx6vk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.718306 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.729853 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.731938 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-config-data" (OuterVolumeSpecName: "config-data") pod "f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" (UID: "f33b0910-946f-4f79-a7a1-1b12f9dc9b3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.748944 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" (UID: "f33b0910-946f-4f79-a7a1-1b12f9dc9b3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.756048 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 01 15:22:35 crc kubenswrapper[4931]: E1201 15:22:35.756543 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-log"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.756566 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-log"
Dec 01 15:22:35 crc kubenswrapper[4931]: E1201 15:22:35.756593 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" containerName="nova-scheduler-scheduler"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.756601 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" containerName="nova-scheduler-scheduler"
Dec 01 15:22:35 crc kubenswrapper[4931]: E1201 15:22:35.756615 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-api"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.756621 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-api"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.756798 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" containerName="nova-scheduler-scheduler"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.756815 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-api"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.756836 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" containerName="nova-api-log"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.757821 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.759991 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.771740 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.792709 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.792741 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.792752 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx6vk\" (UniqueName: \"kubernetes.io/projected/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f-kube-api-access-mx6vk\") on node \"crc\" DevicePath \"\""
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.894790 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bc82730-ffac-46b3-9aff-38f7697d15b1-logs\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.895129 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5cw9\" (UniqueName: \"kubernetes.io/projected/6bc82730-ffac-46b3-9aff-38f7697d15b1-kube-api-access-g5cw9\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.895246 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.895322 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-config-data\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.998111 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.998159 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-config-data\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.998252 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bc82730-ffac-46b3-9aff-38f7697d15b1-logs\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.998282 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5cw9\" (UniqueName: \"kubernetes.io/projected/6bc82730-ffac-46b3-9aff-38f7697d15b1-kube-api-access-g5cw9\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:35 crc kubenswrapper[4931]: I1201 15:22:35.999843 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bc82730-ffac-46b3-9aff-38f7697d15b1-logs\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.005369 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.005409 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-config-data\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.020699 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5cw9\" (UniqueName: \"kubernetes.io/projected/6bc82730-ffac-46b3-9aff-38f7697d15b1-kube-api-access-g5cw9\") pod \"nova-api-0\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " pod="openstack/nova-api-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.079644 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.102646 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.103062 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.256702 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306db296-7d68-4bb8-b723-782c6671f98c" path="/var/lib/kubelet/pods/306db296-7d68-4bb8-b723-782c6671f98c/volumes"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.257895 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89e95fae-7af8-41ee-b86d-97804a008d69" path="/var/lib/kubelet/pods/89e95fae-7af8-41ee-b86d-97804a008d69/volumes"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.384473 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerStarted","Data":"15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b"}
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.384825 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerStarted","Data":"c94277dd5de0672b6c938c456505c76c5dc92d9db04769c12be2493443e2dc73"}
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.389536 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f33b0910-946f-4f79-a7a1-1b12f9dc9b3f","Type":"ContainerDied","Data":"c32f54206417dea8474527c7bf86a59574a567c5b708fb7e1fb4fc6254df307e"}
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.389601 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.389604 4931 scope.go:117] "RemoveContainer" containerID="a57b7f373a51d0ec2ab52a89d963afa70e30d6422a6294053a30a74fe72c9f04"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.426655 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.457091 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.474537 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.475794 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.477804 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.483549 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.573178 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.617458 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.617496 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l4w2\" (UniqueName: \"kubernetes.io/projected/b5975b3d-809b-40df-a28a-5831c6090edc-kube-api-access-7l4w2\") pod \"nova-scheduler-0\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.617556 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-config-data\") pod \"nova-scheduler-0\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.718864 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.718912 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l4w2\" (UniqueName: \"kubernetes.io/projected/b5975b3d-809b-40df-a28a-5831c6090edc-kube-api-access-7l4w2\") pod \"nova-scheduler-0\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.718980 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-config-data\") pod \"nova-scheduler-0\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.726461 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-config-data\") pod \"nova-scheduler-0\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.728059 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.737061 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l4w2\" (UniqueName: \"kubernetes.io/projected/b5975b3d-809b-40df-a28a-5831c6090edc-kube-api-access-7l4w2\") pod \"nova-scheduler-0\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " pod="openstack/nova-scheduler-0"
Dec 01 15:22:36 crc kubenswrapper[4931]: I1201 15:22:36.793269 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 15:22:37 crc kubenswrapper[4931]: I1201 15:22:37.274486 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 15:22:37 crc kubenswrapper[4931]: W1201 15:22:37.285247 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5975b3d_809b_40df_a28a_5831c6090edc.slice/crio-2dde04284285c641415fa9ef5b209596418a7ae70faf4976204ae24237323c5b WatchSource:0}: Error finding container 2dde04284285c641415fa9ef5b209596418a7ae70faf4976204ae24237323c5b: Status 404 returned error can't find the container with id 2dde04284285c641415fa9ef5b209596418a7ae70faf4976204ae24237323c5b
Dec 01 15:22:37 crc kubenswrapper[4931]: I1201 15:22:37.404628 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerStarted","Data":"10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55"}
Dec 01 15:22:37 crc kubenswrapper[4931]: I1201 15:22:37.406163 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bc82730-ffac-46b3-9aff-38f7697d15b1","Type":"ContainerStarted","Data":"c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801"}
Dec 01 15:22:37 crc kubenswrapper[4931]: I1201 15:22:37.406188 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bc82730-ffac-46b3-9aff-38f7697d15b1","Type":"ContainerStarted","Data":"4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6"}
Dec 01 15:22:37 crc kubenswrapper[4931]: I1201 15:22:37.406200 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bc82730-ffac-46b3-9aff-38f7697d15b1","Type":"ContainerStarted","Data":"e559621c31fd1167b6418d20546cec5a945ad80401f2e7a4abdb1d4aa6d3b9b7"}
Dec 01 15:22:37 crc kubenswrapper[4931]: I1201 15:22:37.409458 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b5975b3d-809b-40df-a28a-5831c6090edc","Type":"ContainerStarted","Data":"2dde04284285c641415fa9ef5b209596418a7ae70faf4976204ae24237323c5b"}
Dec 01 15:22:37 crc kubenswrapper[4931]: I1201 15:22:37.430763 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.430741235 podStartE2EDuration="2.430741235s" podCreationTimestamp="2025-12-01 15:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:37.423700338 +0000 UTC m=+1303.849574015" watchObservedRunningTime="2025-12-01 15:22:37.430741235 +0000 UTC m=+1303.856614892"
Dec 01 15:22:38 crc kubenswrapper[4931]: I1201 15:22:38.267837 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f33b0910-946f-4f79-a7a1-1b12f9dc9b3f" path="/var/lib/kubelet/pods/f33b0910-946f-4f79-a7a1-1b12f9dc9b3f/volumes"
Dec 01 15:22:38 crc kubenswrapper[4931]: I1201 15:22:38.418896 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerStarted","Data":"2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8"}
Dec 01 15:22:38 crc kubenswrapper[4931]: I1201 15:22:38.421171 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b5975b3d-809b-40df-a28a-5831c6090edc","Type":"ContainerStarted","Data":"6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d"}
Dec 01 15:22:40 crc kubenswrapper[4931]: I1201 15:22:40.440076 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerStarted","Data":"03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85"}
Dec 01 15:22:40 crc kubenswrapper[4931]: I1201 15:22:40.441124 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 01 15:22:40 crc kubenswrapper[4931]: I1201 15:22:40.469158 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.469129744 podStartE2EDuration="4.469129744s" podCreationTimestamp="2025-12-01 15:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:38.442672931 +0000 UTC m=+1304.868546608" watchObservedRunningTime="2025-12-01 15:22:40.469129744 +0000 UTC m=+1306.895003441"
Dec 01 15:22:40 crc kubenswrapper[4931]: I1201 15:22:40.472999 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.147210844 podStartE2EDuration="6.472983862s" podCreationTimestamp="2025-12-01 15:22:34 +0000 UTC" firstStartedPulling="2025-12-01 15:22:35.539146054 +0000 UTC m=+1301.965019731" lastFinishedPulling="2025-12-01 15:22:39.864919092 +0000 UTC m=+1306.290792749"
observedRunningTime="2025-12-01 15:22:40.459123633 +0000 UTC m=+1306.884997360" watchObservedRunningTime="2025-12-01 15:22:40.472983862 +0000 UTC m=+1306.898857529" Dec 01 15:22:40 crc kubenswrapper[4931]: I1201 15:22:40.688763 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 15:22:41 crc kubenswrapper[4931]: I1201 15:22:41.099536 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 15:22:41 crc kubenswrapper[4931]: I1201 15:22:41.099772 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 15:22:41 crc kubenswrapper[4931]: I1201 15:22:41.794174 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 15:22:42 crc kubenswrapper[4931]: I1201 15:22:42.112693 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 15:22:42 crc kubenswrapper[4931]: I1201 15:22:42.112707 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 15:22:46 crc kubenswrapper[4931]: I1201 15:22:46.080769 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 15:22:46 crc kubenswrapper[4931]: I1201 15:22:46.081427 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 15:22:46 crc kubenswrapper[4931]: I1201 15:22:46.794993 4931 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 15:22:46 crc kubenswrapper[4931]: I1201 15:22:46.825571 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 15:22:47 crc kubenswrapper[4931]: I1201 15:22:47.163658 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 15:22:47 crc kubenswrapper[4931]: I1201 15:22:47.163680 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 15:22:47 crc kubenswrapper[4931]: I1201 15:22:47.531357 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 15:22:51 crc kubenswrapper[4931]: I1201 15:22:51.105969 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 15:22:51 crc kubenswrapper[4931]: I1201 15:22:51.106996 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 15:22:51 crc kubenswrapper[4931]: I1201 15:22:51.110702 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 15:22:51 crc kubenswrapper[4931]: I1201 15:22:51.549148 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.566820 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="6d4321f4-2e0f-444e-8e4f-c1ae11c0602c" containerID="d3f93fc64e9553ec77673799f6285c6fbd88af0c65638bb2fb10982cee29ea3b" exitCode=137 Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.568287 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c","Type":"ContainerDied","Data":"d3f93fc64e9553ec77673799f6285c6fbd88af0c65638bb2fb10982cee29ea3b"} Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.568320 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c","Type":"ContainerDied","Data":"4bb97e87f89508323003c777d8ca848ff160d0492abe5dc6f37c7ebf1c347aaf"} Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.568333 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bb97e87f89508323003c777d8ca848ff160d0492abe5dc6f37c7ebf1c347aaf" Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.570817 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.679122 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-combined-ca-bundle\") pod \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.679182 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6dm8\" (UniqueName: \"kubernetes.io/projected/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-kube-api-access-t6dm8\") pod \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.679287 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-config-data\") pod \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\" (UID: \"6d4321f4-2e0f-444e-8e4f-c1ae11c0602c\") " Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.689322 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-kube-api-access-t6dm8" (OuterVolumeSpecName: "kube-api-access-t6dm8") pod "6d4321f4-2e0f-444e-8e4f-c1ae11c0602c" (UID: "6d4321f4-2e0f-444e-8e4f-c1ae11c0602c"). InnerVolumeSpecName "kube-api-access-t6dm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.706100 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-config-data" (OuterVolumeSpecName: "config-data") pod "6d4321f4-2e0f-444e-8e4f-c1ae11c0602c" (UID: "6d4321f4-2e0f-444e-8e4f-c1ae11c0602c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.706735 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d4321f4-2e0f-444e-8e4f-c1ae11c0602c" (UID: "6d4321f4-2e0f-444e-8e4f-c1ae11c0602c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.783539 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.783587 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6dm8\" (UniqueName: \"kubernetes.io/projected/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-kube-api-access-t6dm8\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:53 crc kubenswrapper[4931]: I1201 15:22:53.783601 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.574659 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.600073 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.616830 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.631091 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 15:22:54 crc kubenswrapper[4931]: E1201 15:22:54.631634 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4321f4-2e0f-444e-8e4f-c1ae11c0602c" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.631664 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4321f4-2e0f-444e-8e4f-c1ae11c0602c" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.631937 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d4321f4-2e0f-444e-8e4f-c1ae11c0602c" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.632855 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.636535 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.636942 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.637294 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.642811 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.801459 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.801524 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.801567 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc 
kubenswrapper[4931]: I1201 15:22:54.801683 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27tb4\" (UniqueName: \"kubernetes.io/projected/a5e7833d-55a2-4896-9ca7-610c68157f00-kube-api-access-27tb4\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.801769 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.903673 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.903765 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.903821 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 
15:22:54.903891 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27tb4\" (UniqueName: \"kubernetes.io/projected/a5e7833d-55a2-4896-9ca7-610c68157f00-kube-api-access-27tb4\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.903959 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.909225 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.909277 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.910045 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.910885 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5e7833d-55a2-4896-9ca7-610c68157f00-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.923137 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27tb4\" (UniqueName: \"kubernetes.io/projected/a5e7833d-55a2-4896-9ca7-610c68157f00-kube-api-access-27tb4\") pod \"nova-cell1-novncproxy-0\" (UID: \"a5e7833d-55a2-4896-9ca7-610c68157f00\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:54 crc kubenswrapper[4931]: I1201 15:22:54.950685 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:22:55 crc kubenswrapper[4931]: W1201 15:22:55.414339 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5e7833d_55a2_4896_9ca7_610c68157f00.slice/crio-bd11352f084ab7f622ac4398f72a4b5b3bd404ef52a69fb2cf5abea304f46b18 WatchSource:0}: Error finding container bd11352f084ab7f622ac4398f72a4b5b3bd404ef52a69fb2cf5abea304f46b18: Status 404 returned error can't find the container with id bd11352f084ab7f622ac4398f72a4b5b3bd404ef52a69fb2cf5abea304f46b18 Dec 01 15:22:55 crc kubenswrapper[4931]: I1201 15:22:55.416012 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 15:22:55 crc kubenswrapper[4931]: I1201 15:22:55.585190 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5e7833d-55a2-4896-9ca7-610c68157f00","Type":"ContainerStarted","Data":"bd11352f084ab7f622ac4398f72a4b5b3bd404ef52a69fb2cf5abea304f46b18"} Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.084629 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 15:22:56 crc 
kubenswrapper[4931]: I1201 15:22:56.084992 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.085355 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.085378 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.088467 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.088842 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.290061 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d4321f4-2e0f-444e-8e4f-c1ae11c0602c" path="/var/lib/kubelet/pods/6d4321f4-2e0f-444e-8e4f-c1ae11c0602c/volumes" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.311593 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8tb7h"] Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.315268 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.339820 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8tb7h"] Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.447653 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.447691 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-config\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.447738 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.447763 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.447805 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2bk4n\" (UniqueName: \"kubernetes.io/projected/00d0b21e-97af-42df-b2f0-f137b01e4112-kube-api-access-2bk4n\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.447863 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.549865 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.549929 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-config\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.549976 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.550019 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.550053 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bk4n\" (UniqueName: \"kubernetes.io/projected/00d0b21e-97af-42df-b2f0-f137b01e4112-kube-api-access-2bk4n\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.550105 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.550976 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.551080 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.551899 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-config\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.552057 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.552248 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.593423 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bk4n\" (UniqueName: \"kubernetes.io/projected/00d0b21e-97af-42df-b2f0-f137b01e4112-kube-api-access-2bk4n\") pod \"dnsmasq-dns-5c7b6c5df9-8tb7h\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.617807 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a5e7833d-55a2-4896-9ca7-610c68157f00","Type":"ContainerStarted","Data":"c568c8886415bb9da2c7aade28d204716c631d8298c3913619c7c3e830e8f5eb"} Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.646037 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.645998956 podStartE2EDuration="2.645998956s" podCreationTimestamp="2025-12-01 15:22:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:56.645040219 +0000 UTC m=+1323.070913886" watchObservedRunningTime="2025-12-01 15:22:56.645998956 +0000 UTC m=+1323.071872623" Dec 01 15:22:56 crc kubenswrapper[4931]: I1201 15:22:56.649465 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:57 crc kubenswrapper[4931]: I1201 15:22:57.145335 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8tb7h"] Dec 01 15:22:57 crc kubenswrapper[4931]: I1201 15:22:57.627141 4931 generic.go:334] "Generic (PLEG): container finished" podID="00d0b21e-97af-42df-b2f0-f137b01e4112" containerID="ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e" exitCode=0 Dec 01 15:22:57 crc kubenswrapper[4931]: I1201 15:22:57.628717 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" event={"ID":"00d0b21e-97af-42df-b2f0-f137b01e4112","Type":"ContainerDied","Data":"ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e"} Dec 01 15:22:57 crc kubenswrapper[4931]: I1201 15:22:57.628751 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" event={"ID":"00d0b21e-97af-42df-b2f0-f137b01e4112","Type":"ContainerStarted","Data":"edd30ac6a18164f82337bf9552e70b64a9849409e684337fe6c279520d0b9ce0"} Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.326568 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.327128 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="ceilometer-central-agent" containerID="cri-o://15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b" gracePeriod=30 Dec 01 15:22:58 crc 
kubenswrapper[4931]: I1201 15:22:58.327199 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="sg-core" containerID="cri-o://2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8" gracePeriod=30 Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.327243 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="ceilometer-notification-agent" containerID="cri-o://10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55" gracePeriod=30 Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.327573 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="proxy-httpd" containerID="cri-o://03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85" gracePeriod=30 Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.427747 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.190:3000/\": read tcp 10.217.0.2:36844->10.217.0.190:3000: read: connection reset by peer" Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.636220 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" event={"ID":"00d0b21e-97af-42df-b2f0-f137b01e4112","Type":"ContainerStarted","Data":"96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e"} Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.636355 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.638897 4931 generic.go:334] "Generic (PLEG): container 
finished" podID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerID="03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85" exitCode=0 Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.638925 4931 generic.go:334] "Generic (PLEG): container finished" podID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerID="2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8" exitCode=2 Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.638943 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerDied","Data":"03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85"} Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.638965 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerDied","Data":"2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8"} Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.659168 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" podStartSLOduration=2.659153196 podStartE2EDuration="2.659153196s" podCreationTimestamp="2025-12-01 15:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:22:58.653891839 +0000 UTC m=+1325.079765506" watchObservedRunningTime="2025-12-01 15:22:58.659153196 +0000 UTC m=+1325.085026863" Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.781562 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.781780 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-log" 
containerID="cri-o://4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6" gracePeriod=30 Dec 01 15:22:58 crc kubenswrapper[4931]: I1201 15:22:58.781876 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-api" containerID="cri-o://c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801" gracePeriod=30 Dec 01 15:22:59 crc kubenswrapper[4931]: I1201 15:22:59.650341 4931 generic.go:334] "Generic (PLEG): container finished" podID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerID="4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6" exitCode=143 Dec 01 15:22:59 crc kubenswrapper[4931]: I1201 15:22:59.650396 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bc82730-ffac-46b3-9aff-38f7697d15b1","Type":"ContainerDied","Data":"4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6"} Dec 01 15:22:59 crc kubenswrapper[4931]: I1201 15:22:59.653736 4931 generic.go:334] "Generic (PLEG): container finished" podID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerID="15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b" exitCode=0 Dec 01 15:22:59 crc kubenswrapper[4931]: I1201 15:22:59.653776 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerDied","Data":"15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b"} Dec 01 15:22:59 crc kubenswrapper[4931]: I1201 15:22:59.951465 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.399358 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.569014 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-config-data\") pod \"6bc82730-ffac-46b3-9aff-38f7697d15b1\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.569170 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5cw9\" (UniqueName: \"kubernetes.io/projected/6bc82730-ffac-46b3-9aff-38f7697d15b1-kube-api-access-g5cw9\") pod \"6bc82730-ffac-46b3-9aff-38f7697d15b1\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.569923 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-combined-ca-bundle\") pod \"6bc82730-ffac-46b3-9aff-38f7697d15b1\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.570101 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bc82730-ffac-46b3-9aff-38f7697d15b1-logs\") pod \"6bc82730-ffac-46b3-9aff-38f7697d15b1\" (UID: \"6bc82730-ffac-46b3-9aff-38f7697d15b1\") " Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.571949 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc82730-ffac-46b3-9aff-38f7697d15b1-logs" (OuterVolumeSpecName: "logs") pod "6bc82730-ffac-46b3-9aff-38f7697d15b1" (UID: "6bc82730-ffac-46b3-9aff-38f7697d15b1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.581220 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc82730-ffac-46b3-9aff-38f7697d15b1-kube-api-access-g5cw9" (OuterVolumeSpecName: "kube-api-access-g5cw9") pod "6bc82730-ffac-46b3-9aff-38f7697d15b1" (UID: "6bc82730-ffac-46b3-9aff-38f7697d15b1"). InnerVolumeSpecName "kube-api-access-g5cw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.599959 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-config-data" (OuterVolumeSpecName: "config-data") pod "6bc82730-ffac-46b3-9aff-38f7697d15b1" (UID: "6bc82730-ffac-46b3-9aff-38f7697d15b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.605563 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bc82730-ffac-46b3-9aff-38f7697d15b1" (UID: "6bc82730-ffac-46b3-9aff-38f7697d15b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.672596 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5cw9\" (UniqueName: \"kubernetes.io/projected/6bc82730-ffac-46b3-9aff-38f7697d15b1-kube-api-access-g5cw9\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.672911 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.672920 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bc82730-ffac-46b3-9aff-38f7697d15b1-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.672929 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc82730-ffac-46b3-9aff-38f7697d15b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.681940 4931 generic.go:334] "Generic (PLEG): container finished" podID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerID="c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801" exitCode=0 Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.681984 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bc82730-ffac-46b3-9aff-38f7697d15b1","Type":"ContainerDied","Data":"c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801"} Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.682009 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bc82730-ffac-46b3-9aff-38f7697d15b1","Type":"ContainerDied","Data":"e559621c31fd1167b6418d20546cec5a945ad80401f2e7a4abdb1d4aa6d3b9b7"} Dec 01 15:23:02 crc kubenswrapper[4931]: 
I1201 15:23:02.682016 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.682025 4931 scope.go:117] "RemoveContainer" containerID="c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.703767 4931 scope.go:117] "RemoveContainer" containerID="4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.714881 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.725257 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.732774 4931 scope.go:117] "RemoveContainer" containerID="c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801" Dec 01 15:23:02 crc kubenswrapper[4931]: E1201 15:23:02.735807 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801\": container with ID starting with c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801 not found: ID does not exist" containerID="c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.735850 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801"} err="failed to get container status \"c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801\": rpc error: code = NotFound desc = could not find container \"c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801\": container with ID starting with c67980cd0fe18ec2300efba4def3dcf8e0a1f204b79c9f2d5bffde4197818801 not found: 
ID does not exist" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.735873 4931 scope.go:117] "RemoveContainer" containerID="4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.736217 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:02 crc kubenswrapper[4931]: E1201 15:23:02.736301 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6\": container with ID starting with 4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6 not found: ID does not exist" containerID="4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.736324 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6"} err="failed to get container status \"4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6\": rpc error: code = NotFound desc = could not find container \"4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6\": container with ID starting with 4f4062028ac9e103a4efef3170136c4b4515ac34b71c56154304b340d832acd6 not found: ID does not exist" Dec 01 15:23:02 crc kubenswrapper[4931]: E1201 15:23:02.737051 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-log" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.737073 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-log" Dec 01 15:23:02 crc kubenswrapper[4931]: E1201 15:23:02.737091 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-api" Dec 01 
15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.737098 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-api" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.737290 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-log" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.737312 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" containerName="nova-api-api" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.738339 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.744153 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.744545 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.744715 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.751228 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.875349 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.875566 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-config-data\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.875596 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.875624 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9snk2\" (UniqueName: \"kubernetes.io/projected/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-kube-api-access-9snk2\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.875647 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-logs\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.875712 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.977466 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-config-data\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " 
pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.977519 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.977543 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9snk2\" (UniqueName: \"kubernetes.io/projected/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-kube-api-access-9snk2\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.977560 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-logs\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.977602 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.977649 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.978102 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-logs\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.982199 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.982988 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.983301 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-config-data\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.983861 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:02 crc kubenswrapper[4931]: I1201 15:23:02.996643 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9snk2\" (UniqueName: \"kubernetes.io/projected/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-kube-api-access-9snk2\") pod \"nova-api-0\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " pod="openstack/nova-api-0" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.068250 4931 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.437059 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.591813 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-config-data\") pod \"25bb18d3-3005-4565-adac-25e2bdd8449c\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.591879 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-log-httpd\") pod \"25bb18d3-3005-4565-adac-25e2bdd8449c\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.591921 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-scripts\") pod \"25bb18d3-3005-4565-adac-25e2bdd8449c\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.592136 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-combined-ca-bundle\") pod \"25bb18d3-3005-4565-adac-25e2bdd8449c\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.592185 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-run-httpd\") pod \"25bb18d3-3005-4565-adac-25e2bdd8449c\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " Dec 01 15:23:03 crc 
kubenswrapper[4931]: I1201 15:23:03.592268 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjs2t\" (UniqueName: \"kubernetes.io/projected/25bb18d3-3005-4565-adac-25e2bdd8449c-kube-api-access-pjs2t\") pod \"25bb18d3-3005-4565-adac-25e2bdd8449c\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.592299 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-sg-core-conf-yaml\") pod \"25bb18d3-3005-4565-adac-25e2bdd8449c\" (UID: \"25bb18d3-3005-4565-adac-25e2bdd8449c\") " Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.592497 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "25bb18d3-3005-4565-adac-25e2bdd8449c" (UID: "25bb18d3-3005-4565-adac-25e2bdd8449c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.592806 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "25bb18d3-3005-4565-adac-25e2bdd8449c" (UID: "25bb18d3-3005-4565-adac-25e2bdd8449c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.592832 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.596125 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-scripts" (OuterVolumeSpecName: "scripts") pod "25bb18d3-3005-4565-adac-25e2bdd8449c" (UID: "25bb18d3-3005-4565-adac-25e2bdd8449c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.596603 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25bb18d3-3005-4565-adac-25e2bdd8449c-kube-api-access-pjs2t" (OuterVolumeSpecName: "kube-api-access-pjs2t") pod "25bb18d3-3005-4565-adac-25e2bdd8449c" (UID: "25bb18d3-3005-4565-adac-25e2bdd8449c"). InnerVolumeSpecName "kube-api-access-pjs2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.599430 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.631815 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "25bb18d3-3005-4565-adac-25e2bdd8449c" (UID: "25bb18d3-3005-4565-adac-25e2bdd8449c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.665469 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25bb18d3-3005-4565-adac-25e2bdd8449c" (UID: "25bb18d3-3005-4565-adac-25e2bdd8449c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.697621 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.697661 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25bb18d3-3005-4565-adac-25e2bdd8449c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.697675 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjs2t\" (UniqueName: \"kubernetes.io/projected/25bb18d3-3005-4565-adac-25e2bdd8449c-kube-api-access-pjs2t\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.697715 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.697727 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.709872 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351","Type":"ContainerStarted","Data":"682898e0e97566e2468a77c753a02781e3adbd65503e76ad52c9474560551273"} Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.714080 4931 generic.go:334] "Generic (PLEG): container finished" podID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerID="10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55" exitCode=0 Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.714127 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerDied","Data":"10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55"} Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.714350 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25bb18d3-3005-4565-adac-25e2bdd8449c","Type":"ContainerDied","Data":"c94277dd5de0672b6c938c456505c76c5dc92d9db04769c12be2493443e2dc73"} Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.714376 4931 scope.go:117] "RemoveContainer" containerID="03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.714543 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.740330 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-config-data" (OuterVolumeSpecName: "config-data") pod "25bb18d3-3005-4565-adac-25e2bdd8449c" (UID: "25bb18d3-3005-4565-adac-25e2bdd8449c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.741508 4931 scope.go:117] "RemoveContainer" containerID="2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.759920 4931 scope.go:117] "RemoveContainer" containerID="10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.779530 4931 scope.go:117] "RemoveContainer" containerID="15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.797075 4931 scope.go:117] "RemoveContainer" containerID="03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85" Dec 01 15:23:03 crc kubenswrapper[4931]: E1201 15:23:03.797416 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85\": container with ID starting with 03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85 not found: ID does not exist" containerID="03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.797454 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85"} err="failed to get container status \"03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85\": rpc error: code = NotFound desc = could not find container \"03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85\": container with ID starting with 03da02dbac4e337f07ad8ace8d68cb4da2447aee6eb9a60f8b193f91bffbce85 not found: ID does not exist" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.797480 4931 scope.go:117] "RemoveContainer" 
containerID="2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8" Dec 01 15:23:03 crc kubenswrapper[4931]: E1201 15:23:03.798014 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8\": container with ID starting with 2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8 not found: ID does not exist" containerID="2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.798073 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8"} err="failed to get container status \"2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8\": rpc error: code = NotFound desc = could not find container \"2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8\": container with ID starting with 2883a4ba723e14a5c45019db5f14dda686908e8c5a78aca043af256939fdd9a8 not found: ID does not exist" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.798115 4931 scope.go:117] "RemoveContainer" containerID="10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55" Dec 01 15:23:03 crc kubenswrapper[4931]: E1201 15:23:03.798512 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55\": container with ID starting with 10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55 not found: ID does not exist" containerID="10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.798552 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55"} err="failed to get container status \"10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55\": rpc error: code = NotFound desc = could not find container \"10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55\": container with ID starting with 10e2848d7f019ff3cb9b5c1ab42ac0d38bf50eb791344bed4f2cebb5eaaf0f55 not found: ID does not exist" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.798580 4931 scope.go:117] "RemoveContainer" containerID="15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b" Dec 01 15:23:03 crc kubenswrapper[4931]: E1201 15:23:03.798832 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b\": container with ID starting with 15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b not found: ID does not exist" containerID="15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.798866 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b"} err="failed to get container status \"15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b\": rpc error: code = NotFound desc = could not find container \"15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b\": container with ID starting with 15b64bcbdcfe45cac803a6214a536f155eaa75f508cbaba34d3776e733454d4b not found: ID does not exist" Dec 01 15:23:03 crc kubenswrapper[4931]: I1201 15:23:03.798883 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bb18d3-3005-4565-adac-25e2bdd8449c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:04 crc kubenswrapper[4931]: 
I1201 15:23:04.085125 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.093749 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.107102 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:04 crc kubenswrapper[4931]: E1201 15:23:04.107588 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="ceilometer-notification-agent" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.107616 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="ceilometer-notification-agent" Dec 01 15:23:04 crc kubenswrapper[4931]: E1201 15:23:04.107638 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="sg-core" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.107646 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="sg-core" Dec 01 15:23:04 crc kubenswrapper[4931]: E1201 15:23:04.107664 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="proxy-httpd" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.107672 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="proxy-httpd" Dec 01 15:23:04 crc kubenswrapper[4931]: E1201 15:23:04.107707 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="ceilometer-central-agent" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.107714 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" 
containerName="ceilometer-central-agent" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.108114 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="proxy-httpd" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.108143 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="ceilometer-central-agent" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.108161 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="sg-core" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.108176 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" containerName="ceilometer-notification-agent" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.112482 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.114717 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.114908 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.119788 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.207242 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvlrl\" (UniqueName: \"kubernetes.io/projected/25133581-9318-4292-acb4-38574d51628a-kube-api-access-mvlrl\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.207321 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-scripts\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.207373 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-run-httpd\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.207425 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-log-httpd\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.207460 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.207709 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-config-data\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.207876 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.253334 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25bb18d3-3005-4565-adac-25e2bdd8449c" path="/var/lib/kubelet/pods/25bb18d3-3005-4565-adac-25e2bdd8449c/volumes" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.254164 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc82730-ffac-46b3-9aff-38f7697d15b1" path="/var/lib/kubelet/pods/6bc82730-ffac-46b3-9aff-38f7697d15b1/volumes" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.310230 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-run-httpd\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.310569 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-log-httpd\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.310687 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-run-httpd\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.310830 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.311017 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-config-data\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.311623 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.311783 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvlrl\" (UniqueName: \"kubernetes.io/projected/25133581-9318-4292-acb4-38574d51628a-kube-api-access-mvlrl\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.311888 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-scripts\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.311046 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-log-httpd\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.314812 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.315632 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.316184 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-scripts\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.326023 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-config-data\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.348857 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvlrl\" (UniqueName: \"kubernetes.io/projected/25133581-9318-4292-acb4-38574d51628a-kube-api-access-mvlrl\") pod \"ceilometer-0\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.445198 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.726170 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351","Type":"ContainerStarted","Data":"7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe"} Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.726572 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351","Type":"ContainerStarted","Data":"78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5"} Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.757645 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.757625932 podStartE2EDuration="2.757625932s" podCreationTimestamp="2025-12-01 15:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:23:04.74578534 +0000 UTC m=+1331.171659007" watchObservedRunningTime="2025-12-01 15:23:04.757625932 +0000 UTC m=+1331.183499599" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.876562 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:04 crc kubenswrapper[4931]: W1201 15:23:04.884137 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25133581_9318_4292_acb4_38574d51628a.slice/crio-e5551b865533a7f621e75ccf393ed7b8bcfca358c07f8ba36f1ecb16db76ec82 WatchSource:0}: Error finding container e5551b865533a7f621e75ccf393ed7b8bcfca358c07f8ba36f1ecb16db76ec82: Status 404 returned error can't find the container with id e5551b865533a7f621e75ccf393ed7b8bcfca358c07f8ba36f1ecb16db76ec82 Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.950969 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:23:04 crc kubenswrapper[4931]: I1201 15:23:04.978074 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:23:05 crc kubenswrapper[4931]: I1201 15:23:05.744841 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerStarted","Data":"e5551b865533a7f621e75ccf393ed7b8bcfca358c07f8ba36f1ecb16db76ec82"} Dec 01 15:23:05 crc kubenswrapper[4931]: I1201 15:23:05.768920 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 01 15:23:05 crc kubenswrapper[4931]: I1201 15:23:05.992575 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-t2sgb"] Dec 01 15:23:05 crc kubenswrapper[4931]: I1201 15:23:05.994085 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.000867 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.001032 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.005824 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-t2sgb"] Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.148063 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-config-data\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.148217 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cttsd\" (UniqueName: \"kubernetes.io/projected/6da4dc84-4773-4ff6-878d-e042e108cb65-kube-api-access-cttsd\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.148288 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-scripts\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.148352 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.250052 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.250106 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-config-data\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.250201 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cttsd\" (UniqueName: \"kubernetes.io/projected/6da4dc84-4773-4ff6-878d-e042e108cb65-kube-api-access-cttsd\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.250237 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-scripts\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.260753 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-scripts\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.262523 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-config-data\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.263216 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.275303 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cttsd\" (UniqueName: \"kubernetes.io/projected/6da4dc84-4773-4ff6-878d-e042e108cb65-kube-api-access-cttsd\") pod \"nova-cell1-cell-mapping-t2sgb\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.313207 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.650517 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.707120 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-9z8sn"] Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.707365 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" podUID="4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" containerName="dnsmasq-dns" containerID="cri-o://6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15" gracePeriod=10 Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.763609 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerStarted","Data":"fc3576d96c401d3c6ca1b16f28423ab1c9a3427f74238dcd88f74a17b5011404"} Dec 01 15:23:06 crc kubenswrapper[4931]: I1201 15:23:06.807185 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-t2sgb"] Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.146456 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.267541 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-nb\") pod \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.267628 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-sb\") pod \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.267697 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-config\") pod \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.267739 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-swift-storage-0\") pod \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.267764 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-svc\") pod \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.267838 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9kr8\" 
(UniqueName: \"kubernetes.io/projected/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-kube-api-access-p9kr8\") pod \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\" (UID: \"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc\") " Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.284847 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-kube-api-access-p9kr8" (OuterVolumeSpecName: "kube-api-access-p9kr8") pod "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" (UID: "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc"). InnerVolumeSpecName "kube-api-access-p9kr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.332820 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-config" (OuterVolumeSpecName: "config") pod "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" (UID: "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.339046 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" (UID: "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.346142 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" (UID: "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.369686 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9kr8\" (UniqueName: \"kubernetes.io/projected/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-kube-api-access-p9kr8\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.369721 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.369730 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.369739 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.396962 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" (UID: "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.403566 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" (UID: "4c20a0b0-6763-49c4-a384-ce95d4c1e6cc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.471429 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.471467 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.709902 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.777698 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerStarted","Data":"3a8d5c86a19828e97274c967b246642fbe0f2bab6277e0aee0736bfad8898a20"} Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.777747 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerStarted","Data":"5896f2992f743546cd7bbbc5e825ac5373a1d513237a46a63b7d0f0352f25088"} Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.779729 4931 generic.go:334] "Generic (PLEG): container finished" podID="4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" containerID="6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15" exitCode=0 Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.779781 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" event={"ID":"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc","Type":"ContainerDied","Data":"6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15"} Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.779803 4931 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" event={"ID":"4c20a0b0-6763-49c4-a384-ce95d4c1e6cc","Type":"ContainerDied","Data":"8eff5af54e133607574c21a63733a724247ec19d862a5059c2c800363bd3d3af"} Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.779848 4931 scope.go:117] "RemoveContainer" containerID="6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.779996 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-9z8sn" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.787362 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t2sgb" event={"ID":"6da4dc84-4773-4ff6-878d-e042e108cb65","Type":"ContainerStarted","Data":"75d95c7c0b9d6b902491f9421e3295bea94221c088887f27ab37c1ec59319e65"} Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.787426 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t2sgb" event={"ID":"6da4dc84-4773-4ff6-878d-e042e108cb65","Type":"ContainerStarted","Data":"692cd4ad8634d7c2a07251b5446471b3c1797222a1424bc864916f2affbaad13"} Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.819751 4931 scope.go:117] "RemoveContainer" containerID="99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.826643 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-t2sgb" podStartSLOduration=2.826624049 podStartE2EDuration="2.826624049s" podCreationTimestamp="2025-12-01 15:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:23:07.805184128 +0000 UTC m=+1334.231057795" watchObservedRunningTime="2025-12-01 15:23:07.826624049 +0000 UTC m=+1334.252497716" Dec 
01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.839446 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-9z8sn"] Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.846207 4931 scope.go:117] "RemoveContainer" containerID="6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15" Dec 01 15:23:07 crc kubenswrapper[4931]: E1201 15:23:07.846653 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15\": container with ID starting with 6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15 not found: ID does not exist" containerID="6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.846695 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15"} err="failed to get container status \"6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15\": rpc error: code = NotFound desc = could not find container \"6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15\": container with ID starting with 6a0ec810924d11dd5211b5762f994fb906e2e6cbccd6828f09830b848b032c15 not found: ID does not exist" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.846719 4931 scope.go:117] "RemoveContainer" containerID="99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280" Dec 01 15:23:07 crc kubenswrapper[4931]: E1201 15:23:07.847110 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280\": container with ID starting with 99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280 not found: ID does not exist" 
containerID="99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.847137 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280"} err="failed to get container status \"99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280\": rpc error: code = NotFound desc = could not find container \"99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280\": container with ID starting with 99a0fcc476b288b4a5adbb6b77d4562d3e36faf0e9be13f15bece80f7ff98280 not found: ID does not exist" Dec 01 15:23:07 crc kubenswrapper[4931]: I1201 15:23:07.848667 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-9z8sn"] Dec 01 15:23:08 crc kubenswrapper[4931]: I1201 15:23:08.252613 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" path="/var/lib/kubelet/pods/4c20a0b0-6763-49c4-a384-ce95d4c1e6cc/volumes" Dec 01 15:23:11 crc kubenswrapper[4931]: I1201 15:23:11.836612 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerStarted","Data":"d9a5a01e7c3141be3007b4db84739a26170feab8b80d58a005f4e9d5e18d7728"} Dec 01 15:23:11 crc kubenswrapper[4931]: I1201 15:23:11.837245 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 15:23:11 crc kubenswrapper[4931]: I1201 15:23:11.837932 4931 generic.go:334] "Generic (PLEG): container finished" podID="6da4dc84-4773-4ff6-878d-e042e108cb65" containerID="75d95c7c0b9d6b902491f9421e3295bea94221c088887f27ab37c1ec59319e65" exitCode=0 Dec 01 15:23:11 crc kubenswrapper[4931]: I1201 15:23:11.837976 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t2sgb" 
event={"ID":"6da4dc84-4773-4ff6-878d-e042e108cb65","Type":"ContainerDied","Data":"75d95c7c0b9d6b902491f9421e3295bea94221c088887f27ab37c1ec59319e65"} Dec 01 15:23:11 crc kubenswrapper[4931]: I1201 15:23:11.868966 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.169002388 podStartE2EDuration="7.868946359s" podCreationTimestamp="2025-12-01 15:23:04 +0000 UTC" firstStartedPulling="2025-12-01 15:23:04.8870045 +0000 UTC m=+1331.312878167" lastFinishedPulling="2025-12-01 15:23:10.586948471 +0000 UTC m=+1337.012822138" observedRunningTime="2025-12-01 15:23:11.857358624 +0000 UTC m=+1338.283232311" watchObservedRunningTime="2025-12-01 15:23:11.868946359 +0000 UTC m=+1338.294820036" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.069287 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.069660 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.234662 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.293531 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-config-data\") pod \"6da4dc84-4773-4ff6-878d-e042e108cb65\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.293750 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-scripts\") pod \"6da4dc84-4773-4ff6-878d-e042e108cb65\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.293796 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cttsd\" (UniqueName: \"kubernetes.io/projected/6da4dc84-4773-4ff6-878d-e042e108cb65-kube-api-access-cttsd\") pod \"6da4dc84-4773-4ff6-878d-e042e108cb65\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.293833 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-combined-ca-bundle\") pod \"6da4dc84-4773-4ff6-878d-e042e108cb65\" (UID: \"6da4dc84-4773-4ff6-878d-e042e108cb65\") " Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.301829 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da4dc84-4773-4ff6-878d-e042e108cb65-kube-api-access-cttsd" (OuterVolumeSpecName: "kube-api-access-cttsd") pod "6da4dc84-4773-4ff6-878d-e042e108cb65" (UID: "6da4dc84-4773-4ff6-878d-e042e108cb65"). InnerVolumeSpecName "kube-api-access-cttsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.320555 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-scripts" (OuterVolumeSpecName: "scripts") pod "6da4dc84-4773-4ff6-878d-e042e108cb65" (UID: "6da4dc84-4773-4ff6-878d-e042e108cb65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.324119 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-config-data" (OuterVolumeSpecName: "config-data") pod "6da4dc84-4773-4ff6-878d-e042e108cb65" (UID: "6da4dc84-4773-4ff6-878d-e042e108cb65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.324601 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6da4dc84-4773-4ff6-878d-e042e108cb65" (UID: "6da4dc84-4773-4ff6-878d-e042e108cb65"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.395836 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.395880 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.395892 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6da4dc84-4773-4ff6-878d-e042e108cb65-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.395902 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cttsd\" (UniqueName: \"kubernetes.io/projected/6da4dc84-4773-4ff6-878d-e042e108cb65-kube-api-access-cttsd\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.872915 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-t2sgb" event={"ID":"6da4dc84-4773-4ff6-878d-e042e108cb65","Type":"ContainerDied","Data":"692cd4ad8634d7c2a07251b5446471b3c1797222a1424bc864916f2affbaad13"} Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.873225 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="692cd4ad8634d7c2a07251b5446471b3c1797222a1424bc864916f2affbaad13" Dec 01 15:23:13 crc kubenswrapper[4931]: I1201 15:23:13.873286 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-t2sgb" Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.080255 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.080545 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-log" containerID="cri-o://78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5" gracePeriod=30 Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.080691 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-api" containerID="cri-o://7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe" gracePeriod=30 Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.084710 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.084825 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.096772 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.097105 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b5975b3d-809b-40df-a28a-5831c6090edc" 
containerName="nova-scheduler-scheduler" containerID="cri-o://6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d" gracePeriod=30 Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.145498 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.145826 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-log" containerID="cri-o://684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6" gracePeriod=30 Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.145909 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-metadata" containerID="cri-o://4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f" gracePeriod=30 Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.883748 4931 generic.go:334] "Generic (PLEG): container finished" podID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerID="78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5" exitCode=143 Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.883827 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351","Type":"ContainerDied","Data":"78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5"} Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.887714 4931 generic.go:334] "Generic (PLEG): container finished" podID="59913d10-be6d-4c01-850a-4da0139c28cf" containerID="684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6" exitCode=143 Dec 01 15:23:14 crc kubenswrapper[4931]: I1201 15:23:14.887743 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"59913d10-be6d-4c01-850a-4da0139c28cf","Type":"ContainerDied","Data":"684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6"} Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.791550 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.846064 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l4w2\" (UniqueName: \"kubernetes.io/projected/b5975b3d-809b-40df-a28a-5831c6090edc-kube-api-access-7l4w2\") pod \"b5975b3d-809b-40df-a28a-5831c6090edc\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.846278 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-config-data\") pod \"b5975b3d-809b-40df-a28a-5831c6090edc\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.846319 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-combined-ca-bundle\") pod \"b5975b3d-809b-40df-a28a-5831c6090edc\" (UID: \"b5975b3d-809b-40df-a28a-5831c6090edc\") " Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.856593 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5975b3d-809b-40df-a28a-5831c6090edc-kube-api-access-7l4w2" (OuterVolumeSpecName: "kube-api-access-7l4w2") pod "b5975b3d-809b-40df-a28a-5831c6090edc" (UID: "b5975b3d-809b-40df-a28a-5831c6090edc"). InnerVolumeSpecName "kube-api-access-7l4w2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.878255 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-config-data" (OuterVolumeSpecName: "config-data") pod "b5975b3d-809b-40df-a28a-5831c6090edc" (UID: "b5975b3d-809b-40df-a28a-5831c6090edc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.883877 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5975b3d-809b-40df-a28a-5831c6090edc" (UID: "b5975b3d-809b-40df-a28a-5831c6090edc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.897485 4931 generic.go:334] "Generic (PLEG): container finished" podID="b5975b3d-809b-40df-a28a-5831c6090edc" containerID="6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d" exitCode=0 Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.897528 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b5975b3d-809b-40df-a28a-5831c6090edc","Type":"ContainerDied","Data":"6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d"} Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.897550 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.897575 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b5975b3d-809b-40df-a28a-5831c6090edc","Type":"ContainerDied","Data":"2dde04284285c641415fa9ef5b209596418a7ae70faf4976204ae24237323c5b"} Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.897595 4931 scope.go:117] "RemoveContainer" containerID="6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.949121 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.949159 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5975b3d-809b-40df-a28a-5831c6090edc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.949177 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l4w2\" (UniqueName: \"kubernetes.io/projected/b5975b3d-809b-40df-a28a-5831c6090edc-kube-api-access-7l4w2\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.969428 4931 scope.go:117] "RemoveContainer" containerID="6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d" Dec 01 15:23:15 crc kubenswrapper[4931]: E1201 15:23:15.969900 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d\": container with ID starting with 6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d not found: ID does not exist" containerID="6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d" Dec 
01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.969933 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d"} err="failed to get container status \"6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d\": rpc error: code = NotFound desc = could not find container \"6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d\": container with ID starting with 6fac9f44d434c5b69a527026c9917d00afdb90905555e72c5abc7e9ebc007e0d not found: ID does not exist" Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.980235 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:23:15 crc kubenswrapper[4931]: I1201 15:23:15.992281 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.000711 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:23:16 crc kubenswrapper[4931]: E1201 15:23:16.001110 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da4dc84-4773-4ff6-878d-e042e108cb65" containerName="nova-manage" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.001128 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da4dc84-4773-4ff6-878d-e042e108cb65" containerName="nova-manage" Dec 01 15:23:16 crc kubenswrapper[4931]: E1201 15:23:16.001140 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" containerName="init" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.001148 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" containerName="init" Dec 01 15:23:16 crc kubenswrapper[4931]: E1201 15:23:16.001179 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5975b3d-809b-40df-a28a-5831c6090edc" 
containerName="nova-scheduler-scheduler" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.001185 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5975b3d-809b-40df-a28a-5831c6090edc" containerName="nova-scheduler-scheduler" Dec 01 15:23:16 crc kubenswrapper[4931]: E1201 15:23:16.001201 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" containerName="dnsmasq-dns" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.001207 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" containerName="dnsmasq-dns" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.001370 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c20a0b0-6763-49c4-a384-ce95d4c1e6cc" containerName="dnsmasq-dns" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.001421 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da4dc84-4773-4ff6-878d-e042e108cb65" containerName="nova-manage" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.001433 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5975b3d-809b-40df-a28a-5831c6090edc" containerName="nova-scheduler-scheduler" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.002033 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.004440 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.012785 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.156231 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951e2313-b009-42b4-8d52-20a4e3ad6dbf-config-data\") pod \"nova-scheduler-0\" (UID: \"951e2313-b009-42b4-8d52-20a4e3ad6dbf\") " pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.156688 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gnp9\" (UniqueName: \"kubernetes.io/projected/951e2313-b009-42b4-8d52-20a4e3ad6dbf-kube-api-access-7gnp9\") pod \"nova-scheduler-0\" (UID: \"951e2313-b009-42b4-8d52-20a4e3ad6dbf\") " pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.156815 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951e2313-b009-42b4-8d52-20a4e3ad6dbf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"951e2313-b009-42b4-8d52-20a4e3ad6dbf\") " pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.253851 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5975b3d-809b-40df-a28a-5831c6090edc" path="/var/lib/kubelet/pods/b5975b3d-809b-40df-a28a-5831c6090edc/volumes" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.258647 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gnp9\" (UniqueName: 
\"kubernetes.io/projected/951e2313-b009-42b4-8d52-20a4e3ad6dbf-kube-api-access-7gnp9\") pod \"nova-scheduler-0\" (UID: \"951e2313-b009-42b4-8d52-20a4e3ad6dbf\") " pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.258775 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951e2313-b009-42b4-8d52-20a4e3ad6dbf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"951e2313-b009-42b4-8d52-20a4e3ad6dbf\") " pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.258822 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951e2313-b009-42b4-8d52-20a4e3ad6dbf-config-data\") pod \"nova-scheduler-0\" (UID: \"951e2313-b009-42b4-8d52-20a4e3ad6dbf\") " pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.264564 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/951e2313-b009-42b4-8d52-20a4e3ad6dbf-config-data\") pod \"nova-scheduler-0\" (UID: \"951e2313-b009-42b4-8d52-20a4e3ad6dbf\") " pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.264891 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/951e2313-b009-42b4-8d52-20a4e3ad6dbf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"951e2313-b009-42b4-8d52-20a4e3ad6dbf\") " pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.276402 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gnp9\" (UniqueName: \"kubernetes.io/projected/951e2313-b009-42b4-8d52-20a4e3ad6dbf-kube-api-access-7gnp9\") pod \"nova-scheduler-0\" (UID: \"951e2313-b009-42b4-8d52-20a4e3ad6dbf\") " 
pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.328978 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.814133 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 15:23:16 crc kubenswrapper[4931]: W1201 15:23:16.815744 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod951e2313_b009_42b4_8d52_20a4e3ad6dbf.slice/crio-28ca6afaee24ee69b252c650117cdd92e358eeb64dff26d5e854f62a838cf527 WatchSource:0}: Error finding container 28ca6afaee24ee69b252c650117cdd92e358eeb64dff26d5e854f62a838cf527: Status 404 returned error can't find the container with id 28ca6afaee24ee69b252c650117cdd92e358eeb64dff26d5e854f62a838cf527 Dec 01 15:23:16 crc kubenswrapper[4931]: I1201 15:23:16.919847 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"951e2313-b009-42b4-8d52-20a4e3ad6dbf","Type":"ContainerStarted","Data":"28ca6afaee24ee69b252c650117cdd92e358eeb64dff26d5e854f62a838cf527"} Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.279693 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:53996->10.217.0.189:8775: read: connection reset by peer" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.279714 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:54012->10.217.0.189:8775: read: connection reset by peer" Dec 01 15:23:17 crc kubenswrapper[4931]: 
I1201 15:23:17.712569 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.784576 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59913d10-be6d-4c01-850a-4da0139c28cf-logs\") pod \"59913d10-be6d-4c01-850a-4da0139c28cf\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.785233 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59913d10-be6d-4c01-850a-4da0139c28cf-logs" (OuterVolumeSpecName: "logs") pod "59913d10-be6d-4c01-850a-4da0139c28cf" (UID: "59913d10-be6d-4c01-850a-4da0139c28cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.785293 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-nova-metadata-tls-certs\") pod \"59913d10-be6d-4c01-850a-4da0139c28cf\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.785377 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-combined-ca-bundle\") pod \"59913d10-be6d-4c01-850a-4da0139c28cf\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.785545 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-config-data\") pod \"59913d10-be6d-4c01-850a-4da0139c28cf\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " Dec 01 15:23:17 crc kubenswrapper[4931]: 
I1201 15:23:17.785584 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmt5r\" (UniqueName: \"kubernetes.io/projected/59913d10-be6d-4c01-850a-4da0139c28cf-kube-api-access-fmt5r\") pod \"59913d10-be6d-4c01-850a-4da0139c28cf\" (UID: \"59913d10-be6d-4c01-850a-4da0139c28cf\") " Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.786288 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59913d10-be6d-4c01-850a-4da0139c28cf-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.789723 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59913d10-be6d-4c01-850a-4da0139c28cf-kube-api-access-fmt5r" (OuterVolumeSpecName: "kube-api-access-fmt5r") pod "59913d10-be6d-4c01-850a-4da0139c28cf" (UID: "59913d10-be6d-4c01-850a-4da0139c28cf"). InnerVolumeSpecName "kube-api-access-fmt5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.821539 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-config-data" (OuterVolumeSpecName: "config-data") pod "59913d10-be6d-4c01-850a-4da0139c28cf" (UID: "59913d10-be6d-4c01-850a-4da0139c28cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.830280 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59913d10-be6d-4c01-850a-4da0139c28cf" (UID: "59913d10-be6d-4c01-850a-4da0139c28cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.853642 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "59913d10-be6d-4c01-850a-4da0139c28cf" (UID: "59913d10-be6d-4c01-850a-4da0139c28cf"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.888268 4931 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.888309 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.888323 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59913d10-be6d-4c01-850a-4da0139c28cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.888336 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmt5r\" (UniqueName: \"kubernetes.io/projected/59913d10-be6d-4c01-850a-4da0139c28cf-kube-api-access-fmt5r\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.942164 4931 generic.go:334] "Generic (PLEG): container finished" podID="59913d10-be6d-4c01-850a-4da0139c28cf" containerID="4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f" exitCode=0 Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.942269 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"59913d10-be6d-4c01-850a-4da0139c28cf","Type":"ContainerDied","Data":"4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f"} Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.942302 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59913d10-be6d-4c01-850a-4da0139c28cf","Type":"ContainerDied","Data":"0a7a2a3c85c4d41538f7fa976380f63b6ff59965cbe8a2efe03317265fbeaffc"} Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.942342 4931 scope.go:117] "RemoveContainer" containerID="4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.942516 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.957643 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"951e2313-b009-42b4-8d52-20a4e3ad6dbf","Type":"ContainerStarted","Data":"e69bf28df2c11fd9b6759e735f99432d1362d44ac778cd86a0ffc016b9f0c076"} Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.980631 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.980613035 podStartE2EDuration="2.980613035s" podCreationTimestamp="2025-12-01 15:23:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:23:17.97795346 +0000 UTC m=+1344.403827127" watchObservedRunningTime="2025-12-01 15:23:17.980613035 +0000 UTC m=+1344.406486702" Dec 01 15:23:17 crc kubenswrapper[4931]: I1201 15:23:17.989440 4931 scope.go:117] "RemoveContainer" containerID="684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.002766 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.012321 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.026412 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:23:18 crc kubenswrapper[4931]: E1201 15:23:18.026831 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-log" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.026848 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-log" Dec 01 15:23:18 crc kubenswrapper[4931]: E1201 15:23:18.026863 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-metadata" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.026871 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-metadata" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.027094 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-log" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.027160 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" containerName="nova-metadata-metadata" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.028119 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.033646 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.033674 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.036883 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.056557 4931 scope.go:117] "RemoveContainer" containerID="4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f" Dec 01 15:23:18 crc kubenswrapper[4931]: E1201 15:23:18.061420 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f\": container with ID starting with 4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f not found: ID does not exist" containerID="4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.061464 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f"} err="failed to get container status \"4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f\": rpc error: code = NotFound desc = could not find container \"4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f\": container with ID starting with 4be87e29eb7b0e2f1b334ce85d967c67215c743f395941abc648cff9bae0359f not found: ID does not exist" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.061505 4931 scope.go:117] "RemoveContainer" containerID="684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6" Dec 01 15:23:18 crc 
kubenswrapper[4931]: E1201 15:23:18.061824 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6\": container with ID starting with 684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6 not found: ID does not exist" containerID="684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.061849 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6"} err="failed to get container status \"684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6\": rpc error: code = NotFound desc = could not find container \"684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6\": container with ID starting with 684a6febf451423daf0e97a247ad25c015136ffbaf95f97f4460502b04168ad6 not found: ID does not exist" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.091651 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.091739 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2c9p\" (UniqueName: \"kubernetes.io/projected/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-kube-api-access-l2c9p\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.091990 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-logs\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.092259 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.092376 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-config-data\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.194544 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.194615 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2c9p\" (UniqueName: \"kubernetes.io/projected/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-kube-api-access-l2c9p\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.194643 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-logs\") pod \"nova-metadata-0\" (UID: 
\"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.194689 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.194739 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-config-data\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.195643 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-logs\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.199776 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.203776 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-config-data\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.204101 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.215502 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2c9p\" (UniqueName: \"kubernetes.io/projected/5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db-kube-api-access-l2c9p\") pod \"nova-metadata-0\" (UID: \"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db\") " pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.253584 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59913d10-be6d-4c01-850a-4da0139c28cf" path="/var/lib/kubelet/pods/59913d10-be6d-4c01-850a-4da0139c28cf/volumes" Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.354809 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 15:23:18 crc kubenswrapper[4931]: W1201 15:23:18.859917 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5493bd5f_c1b6_469e_b0e3_6fe77c4ec2db.slice/crio-bf3e437638dc534391f998016640bc53a1728820b849830ac29a400d2774f507 WatchSource:0}: Error finding container bf3e437638dc534391f998016640bc53a1728820b849830ac29a400d2774f507: Status 404 returned error can't find the container with id bf3e437638dc534391f998016640bc53a1728820b849830ac29a400d2774f507 Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.863112 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 15:23:18 crc kubenswrapper[4931]: I1201 15:23:18.970351 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db","Type":"ContainerStarted","Data":"bf3e437638dc534391f998016640bc53a1728820b849830ac29a400d2774f507"} Dec 01 15:23:19 crc kubenswrapper[4931]: I1201 15:23:19.963922 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:23:19 crc kubenswrapper[4931]: I1201 15:23:19.979570 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db","Type":"ContainerStarted","Data":"0f016191e7500c17e05a60d0cbbd470e9e596de236cd1f748e07b1ef473ed77c"} Dec 01 15:23:19 crc kubenswrapper[4931]: I1201 15:23:19.979632 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db","Type":"ContainerStarted","Data":"6ccabce41fb4506972ee157b83ff2226e1514a337905e7be0c878ae74c20c648"} Dec 01 15:23:19 crc kubenswrapper[4931]: I1201 15:23:19.981847 4931 generic.go:334] "Generic (PLEG): container finished" podID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerID="7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe" exitCode=0 Dec 01 15:23:19 crc kubenswrapper[4931]: I1201 15:23:19.981884 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351","Type":"ContainerDied","Data":"7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe"} Dec 01 15:23:19 crc kubenswrapper[4931]: I1201 15:23:19.981906 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351","Type":"ContainerDied","Data":"682898e0e97566e2468a77c753a02781e3adbd65503e76ad52c9474560551273"} Dec 01 15:23:19 crc kubenswrapper[4931]: I1201 15:23:19.981925 4931 scope.go:117] "RemoveContainer" containerID="7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe" Dec 01 15:23:19 crc kubenswrapper[4931]: I1201 
15:23:19.982022 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.007119 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.007093998 podStartE2EDuration="2.007093998s" podCreationTimestamp="2025-12-01 15:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:23:20.00286583 +0000 UTC m=+1346.428739517" watchObservedRunningTime="2025-12-01 15:23:20.007093998 +0000 UTC m=+1346.432967665" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.018395 4931 scope.go:117] "RemoveContainer" containerID="78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.049520 4931 scope.go:117] "RemoveContainer" containerID="7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe" Dec 01 15:23:20 crc kubenswrapper[4931]: E1201 15:23:20.049985 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe\": container with ID starting with 7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe not found: ID does not exist" containerID="7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.050046 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe"} err="failed to get container status \"7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe\": rpc error: code = NotFound desc = could not find container \"7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe\": container with ID starting 
with 7568bdc3c5c656ea08c913302c3130584d4c27e5f1c09043777c7024adac4fbe not found: ID does not exist" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.050091 4931 scope.go:117] "RemoveContainer" containerID="78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5" Dec 01 15:23:20 crc kubenswrapper[4931]: E1201 15:23:20.050457 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5\": container with ID starting with 78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5 not found: ID does not exist" containerID="78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.050498 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5"} err="failed to get container status \"78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5\": rpc error: code = NotFound desc = could not find container \"78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5\": container with ID starting with 78d0a1238992c635737d59615b4cc37e4417ed3d554e3adb2cacdec3369f40c5 not found: ID does not exist" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.057206 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-config-data\") pod \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.057268 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-logs\") pod \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\" (UID: 
\"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.057442 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9snk2\" (UniqueName: \"kubernetes.io/projected/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-kube-api-access-9snk2\") pod \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.057483 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-public-tls-certs\") pod \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.057515 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-internal-tls-certs\") pod \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.057569 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-combined-ca-bundle\") pod \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\" (UID: \"fb8e879c-a03d-4c2c-b8e7-bef7caa1f351\") " Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.057921 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-logs" (OuterVolumeSpecName: "logs") pod "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" (UID: "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.063714 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-kube-api-access-9snk2" (OuterVolumeSpecName: "kube-api-access-9snk2") pod "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" (UID: "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351"). InnerVolumeSpecName "kube-api-access-9snk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.089115 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-config-data" (OuterVolumeSpecName: "config-data") pod "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" (UID: "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.091332 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" (UID: "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.118325 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" (UID: "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.122815 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" (UID: "fb8e879c-a03d-4c2c-b8e7-bef7caa1f351"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.160063 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9snk2\" (UniqueName: \"kubernetes.io/projected/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-kube-api-access-9snk2\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.160101 4931 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.160234 4931 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.160360 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.160395 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.160408 4931 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351-logs\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.309791 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.321576 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.332550 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:20 crc kubenswrapper[4931]: E1201 15:23:20.333026 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-log" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.333049 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-log" Dec 01 15:23:20 crc kubenswrapper[4931]: E1201 15:23:20.333099 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-api" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.333107 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-api" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.333283 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-log" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.333292 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" containerName="nova-api-api" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.334373 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.338162 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.339040 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.342275 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.357279 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.465649 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.465707 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.465775 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrv48\" (UniqueName: \"kubernetes.io/projected/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-kube-api-access-hrv48\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.465871 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-config-data\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.465897 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-public-tls-certs\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.465954 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-logs\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.567121 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-logs\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.567243 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.567277 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc 
kubenswrapper[4931]: I1201 15:23:20.567328 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrv48\" (UniqueName: \"kubernetes.io/projected/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-kube-api-access-hrv48\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.567412 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-config-data\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.567435 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-public-tls-certs\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.569251 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-logs\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.571288 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-public-tls-certs\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.571677 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.572296 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.572801 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-config-data\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.584964 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrv48\" (UniqueName: \"kubernetes.io/projected/2410dbb5-8347-4484-b4dc-6ed9dd32edf7-kube-api-access-hrv48\") pod \"nova-api-0\" (UID: \"2410dbb5-8347-4484-b4dc-6ed9dd32edf7\") " pod="openstack/nova-api-0" Dec 01 15:23:20 crc kubenswrapper[4931]: I1201 15:23:20.669445 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 15:23:21 crc kubenswrapper[4931]: I1201 15:23:21.098171 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 15:23:21 crc kubenswrapper[4931]: W1201 15:23:21.099562 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2410dbb5_8347_4484_b4dc_6ed9dd32edf7.slice/crio-eb4d0b9bf862e69435fae1e63aa4433ec4d72c7d331305d6704d9a73db7cb11f WatchSource:0}: Error finding container eb4d0b9bf862e69435fae1e63aa4433ec4d72c7d331305d6704d9a73db7cb11f: Status 404 returned error can't find the container with id eb4d0b9bf862e69435fae1e63aa4433ec4d72c7d331305d6704d9a73db7cb11f Dec 01 15:23:21 crc kubenswrapper[4931]: I1201 15:23:21.330108 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 15:23:22 crc kubenswrapper[4931]: I1201 15:23:22.001753 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2410dbb5-8347-4484-b4dc-6ed9dd32edf7","Type":"ContainerStarted","Data":"c4ce319714bfd1efbbe21d8ce359f601e2bfb96f7fa42753d10513dce524cc81"} Dec 01 15:23:22 crc kubenswrapper[4931]: I1201 15:23:22.001812 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2410dbb5-8347-4484-b4dc-6ed9dd32edf7","Type":"ContainerStarted","Data":"9dce130ccad5ace18e46eaa86fff04be3b7f96209520f688c31afd8748818f77"} Dec 01 15:23:22 crc kubenswrapper[4931]: I1201 15:23:22.001831 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2410dbb5-8347-4484-b4dc-6ed9dd32edf7","Type":"ContainerStarted","Data":"eb4d0b9bf862e69435fae1e63aa4433ec4d72c7d331305d6704d9a73db7cb11f"} Dec 01 15:23:22 crc kubenswrapper[4931]: I1201 15:23:22.268108 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8e879c-a03d-4c2c-b8e7-bef7caa1f351" 
path="/var/lib/kubelet/pods/fb8e879c-a03d-4c2c-b8e7-bef7caa1f351/volumes" Dec 01 15:23:23 crc kubenswrapper[4931]: I1201 15:23:23.356600 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 15:23:23 crc kubenswrapper[4931]: I1201 15:23:23.356758 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 15:23:26 crc kubenswrapper[4931]: I1201 15:23:26.329539 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 15:23:26 crc kubenswrapper[4931]: I1201 15:23:26.356086 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 15:23:26 crc kubenswrapper[4931]: I1201 15:23:26.386835 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=6.3868193810000005 podStartE2EDuration="6.386819381s" podCreationTimestamp="2025-12-01 15:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:23:22.019991582 +0000 UTC m=+1348.445865249" watchObservedRunningTime="2025-12-01 15:23:26.386819381 +0000 UTC m=+1352.812693038" Dec 01 15:23:27 crc kubenswrapper[4931]: I1201 15:23:27.079411 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 15:23:28 crc kubenswrapper[4931]: I1201 15:23:28.356008 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 15:23:28 crc kubenswrapper[4931]: I1201 15:23:28.356063 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 15:23:29 crc kubenswrapper[4931]: I1201 15:23:29.370533 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 15:23:29 crc kubenswrapper[4931]: I1201 15:23:29.370703 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 15:23:30 crc kubenswrapper[4931]: I1201 15:23:30.670744 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 15:23:30 crc kubenswrapper[4931]: I1201 15:23:30.670828 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 15:23:31 crc kubenswrapper[4931]: I1201 15:23:31.684606 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2410dbb5-8347-4484-b4dc-6ed9dd32edf7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 15:23:31 crc kubenswrapper[4931]: I1201 15:23:31.685268 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2410dbb5-8347-4484-b4dc-6ed9dd32edf7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 15:23:34 crc kubenswrapper[4931]: I1201 15:23:34.454124 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 15:23:38 crc kubenswrapper[4931]: I1201 15:23:38.139512 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 15:23:38 
crc kubenswrapper[4931]: I1201 15:23:38.140354 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5a527180-b5fc-47ab-b4e6-24aa23ae703a" containerName="kube-state-metrics" containerID="cri-o://851bebbdf9434cc7a0389a8bc6ba04a0717f6cc8f039f733d2e7af4bff28bb03" gracePeriod=30 Dec 01 15:23:38 crc kubenswrapper[4931]: I1201 15:23:38.362838 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 15:23:38 crc kubenswrapper[4931]: I1201 15:23:38.364991 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 15:23:38 crc kubenswrapper[4931]: I1201 15:23:38.371896 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.149948 4931 generic.go:334] "Generic (PLEG): container finished" podID="5a527180-b5fc-47ab-b4e6-24aa23ae703a" containerID="851bebbdf9434cc7a0389a8bc6ba04a0717f6cc8f039f733d2e7af4bff28bb03" exitCode=2 Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.150523 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a527180-b5fc-47ab-b4e6-24aa23ae703a","Type":"ContainerDied","Data":"851bebbdf9434cc7a0389a8bc6ba04a0717f6cc8f039f733d2e7af4bff28bb03"} Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.159223 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.401722 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.503729 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d2gh\" (UniqueName: \"kubernetes.io/projected/5a527180-b5fc-47ab-b4e6-24aa23ae703a-kube-api-access-2d2gh\") pod \"5a527180-b5fc-47ab-b4e6-24aa23ae703a\" (UID: \"5a527180-b5fc-47ab-b4e6-24aa23ae703a\") " Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.508481 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a527180-b5fc-47ab-b4e6-24aa23ae703a-kube-api-access-2d2gh" (OuterVolumeSpecName: "kube-api-access-2d2gh") pod "5a527180-b5fc-47ab-b4e6-24aa23ae703a" (UID: "5a527180-b5fc-47ab-b4e6-24aa23ae703a"). InnerVolumeSpecName "kube-api-access-2d2gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.606366 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d2gh\" (UniqueName: \"kubernetes.io/projected/5a527180-b5fc-47ab-b4e6-24aa23ae703a-kube-api-access-2d2gh\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.775213 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.775469 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="ceilometer-central-agent" containerID="cri-o://fc3576d96c401d3c6ca1b16f28423ab1c9a3427f74238dcd88f74a17b5011404" gracePeriod=30 Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.775573 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="ceilometer-notification-agent" 
containerID="cri-o://5896f2992f743546cd7bbbc5e825ac5373a1d513237a46a63b7d0f0352f25088" gracePeriod=30 Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.775639 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="proxy-httpd" containerID="cri-o://d9a5a01e7c3141be3007b4db84739a26170feab8b80d58a005f4e9d5e18d7728" gracePeriod=30 Dec 01 15:23:39 crc kubenswrapper[4931]: I1201 15:23:39.775755 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="sg-core" containerID="cri-o://3a8d5c86a19828e97274c967b246642fbe0f2bab6277e0aee0736bfad8898a20" gracePeriod=30 Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.189784 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a527180-b5fc-47ab-b4e6-24aa23ae703a","Type":"ContainerDied","Data":"1c554758ebab44c907e6bdb2febf8b2b0a1e1c96e9345f9c83854edd9d36944d"} Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.189800 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.190149 4931 scope.go:117] "RemoveContainer" containerID="851bebbdf9434cc7a0389a8bc6ba04a0717f6cc8f039f733d2e7af4bff28bb03" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.227863 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.237864 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.254323 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a527180-b5fc-47ab-b4e6-24aa23ae703a" path="/var/lib/kubelet/pods/5a527180-b5fc-47ab-b4e6-24aa23ae703a/volumes" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.257962 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 15:23:40 crc kubenswrapper[4931]: E1201 15:23:40.258326 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a527180-b5fc-47ab-b4e6-24aa23ae703a" containerName="kube-state-metrics" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.258345 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a527180-b5fc-47ab-b4e6-24aa23ae703a" containerName="kube-state-metrics" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.258531 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a527180-b5fc-47ab-b4e6-24aa23ae703a" containerName="kube-state-metrics" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.259155 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.260808 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.261267 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.267686 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.368151 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.368307 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.368368 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws4fp\" (UniqueName: \"kubernetes.io/projected/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-kube-api-access-ws4fp\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.368456 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.470091 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.470266 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.470309 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws4fp\" (UniqueName: \"kubernetes.io/projected/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-kube-api-access-ws4fp\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.470472 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.475260 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.476692 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.478064 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.494033 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws4fp\" (UniqueName: \"kubernetes.io/projected/8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7-kube-api-access-ws4fp\") pod \"kube-state-metrics-0\" (UID: \"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7\") " pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.582927 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.704867 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.705452 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.711031 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 15:23:40 crc kubenswrapper[4931]: I1201 15:23:40.713125 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 15:23:41 crc kubenswrapper[4931]: I1201 15:23:41.107720 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 15:23:41 crc kubenswrapper[4931]: I1201 15:23:41.205138 4931 generic.go:334] "Generic (PLEG): container finished" podID="25133581-9318-4292-acb4-38574d51628a" containerID="3a8d5c86a19828e97274c967b246642fbe0f2bab6277e0aee0736bfad8898a20" exitCode=2 Dec 01 15:23:41 crc kubenswrapper[4931]: I1201 15:23:41.205216 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerDied","Data":"3a8d5c86a19828e97274c967b246642fbe0f2bab6277e0aee0736bfad8898a20"} Dec 01 15:23:41 crc kubenswrapper[4931]: I1201 15:23:41.208196 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7","Type":"ContainerStarted","Data":"cd7f498ce86088f9b76e03923dbc55afbb8f3154e9025793521906f76e8bc84b"} Dec 01 15:23:41 crc kubenswrapper[4931]: I1201 15:23:41.215314 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 15:23:41 crc kubenswrapper[4931]: I1201 15:23:41.233347 4931 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 15:23:42 crc kubenswrapper[4931]: I1201 15:23:42.221713 4931 generic.go:334] "Generic (PLEG): container finished" podID="25133581-9318-4292-acb4-38574d51628a" containerID="d9a5a01e7c3141be3007b4db84739a26170feab8b80d58a005f4e9d5e18d7728" exitCode=0 Dec 01 15:23:42 crc kubenswrapper[4931]: I1201 15:23:42.222029 4931 generic.go:334] "Generic (PLEG): container finished" podID="25133581-9318-4292-acb4-38574d51628a" containerID="fc3576d96c401d3c6ca1b16f28423ab1c9a3427f74238dcd88f74a17b5011404" exitCode=0 Dec 01 15:23:42 crc kubenswrapper[4931]: I1201 15:23:42.221776 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerDied","Data":"d9a5a01e7c3141be3007b4db84739a26170feab8b80d58a005f4e9d5e18d7728"} Dec 01 15:23:42 crc kubenswrapper[4931]: I1201 15:23:42.222134 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerDied","Data":"fc3576d96c401d3c6ca1b16f28423ab1c9a3427f74238dcd88f74a17b5011404"} Dec 01 15:23:43 crc kubenswrapper[4931]: I1201 15:23:43.238820 4931 generic.go:334] "Generic (PLEG): container finished" podID="25133581-9318-4292-acb4-38574d51628a" containerID="5896f2992f743546cd7bbbc5e825ac5373a1d513237a46a63b7d0f0352f25088" exitCode=0 Dec 01 15:23:43 crc kubenswrapper[4931]: I1201 15:23:43.239113 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerDied","Data":"5896f2992f743546cd7bbbc5e825ac5373a1d513237a46a63b7d0f0352f25088"} Dec 01 15:23:43 crc kubenswrapper[4931]: I1201 15:23:43.244700 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7","Type":"ContainerStarted","Data":"a77cc61288fe4f67a7e68c18d6ad6d16488c76f047ac9e536e418400329984f7"} Dec 01 15:23:43 crc kubenswrapper[4931]: I1201 15:23:43.244924 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 15:23:43 crc kubenswrapper[4931]: I1201 15:23:43.268979 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.50192581 podStartE2EDuration="3.268952847s" podCreationTimestamp="2025-12-01 15:23:40 +0000 UTC" firstStartedPulling="2025-12-01 15:23:41.115085863 +0000 UTC m=+1367.540959530" lastFinishedPulling="2025-12-01 15:23:42.8821129 +0000 UTC m=+1369.307986567" observedRunningTime="2025-12-01 15:23:43.262085532 +0000 UTC m=+1369.687959199" watchObservedRunningTime="2025-12-01 15:23:43.268952847 +0000 UTC m=+1369.694826514" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.041519 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.235842 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-config-data\") pod \"25133581-9318-4292-acb4-38574d51628a\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.235885 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-sg-core-conf-yaml\") pod \"25133581-9318-4292-acb4-38574d51628a\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.235956 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-run-httpd\") pod \"25133581-9318-4292-acb4-38574d51628a\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.236048 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-log-httpd\") pod \"25133581-9318-4292-acb4-38574d51628a\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.236093 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvlrl\" (UniqueName: \"kubernetes.io/projected/25133581-9318-4292-acb4-38574d51628a-kube-api-access-mvlrl\") pod \"25133581-9318-4292-acb4-38574d51628a\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.236126 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-combined-ca-bundle\") pod \"25133581-9318-4292-acb4-38574d51628a\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.236162 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-scripts\") pod \"25133581-9318-4292-acb4-38574d51628a\" (UID: \"25133581-9318-4292-acb4-38574d51628a\") " Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.236472 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "25133581-9318-4292-acb4-38574d51628a" (UID: "25133581-9318-4292-acb4-38574d51628a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.236665 4931 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.237416 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "25133581-9318-4292-acb4-38574d51628a" (UID: "25133581-9318-4292-acb4-38574d51628a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.260548 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-scripts" (OuterVolumeSpecName: "scripts") pod "25133581-9318-4292-acb4-38574d51628a" (UID: "25133581-9318-4292-acb4-38574d51628a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.330851 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25133581-9318-4292-acb4-38574d51628a-kube-api-access-mvlrl" (OuterVolumeSpecName: "kube-api-access-mvlrl") pod "25133581-9318-4292-acb4-38574d51628a" (UID: "25133581-9318-4292-acb4-38574d51628a"). InnerVolumeSpecName "kube-api-access-mvlrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.338098 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvlrl\" (UniqueName: \"kubernetes.io/projected/25133581-9318-4292-acb4-38574d51628a-kube-api-access-mvlrl\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.338138 4931 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.338154 4931 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25133581-9318-4292-acb4-38574d51628a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.341461 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "25133581-9318-4292-acb4-38574d51628a" (UID: "25133581-9318-4292-acb4-38574d51628a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.374043 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.402581 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25133581-9318-4292-acb4-38574d51628a","Type":"ContainerDied","Data":"e5551b865533a7f621e75ccf393ed7b8bcfca358c07f8ba36f1ecb16db76ec82"} Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.402633 4931 scope.go:117] "RemoveContainer" containerID="d9a5a01e7c3141be3007b4db84739a26170feab8b80d58a005f4e9d5e18d7728" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.419570 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25133581-9318-4292-acb4-38574d51628a" (UID: "25133581-9318-4292-acb4-38574d51628a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.440985 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.441061 4931 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.445074 4931 scope.go:117] "RemoveContainer" containerID="3a8d5c86a19828e97274c967b246642fbe0f2bab6277e0aee0736bfad8898a20" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.456516 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-config-data" (OuterVolumeSpecName: "config-data") pod "25133581-9318-4292-acb4-38574d51628a" (UID: 
"25133581-9318-4292-acb4-38574d51628a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.474264 4931 scope.go:117] "RemoveContainer" containerID="5896f2992f743546cd7bbbc5e825ac5373a1d513237a46a63b7d0f0352f25088" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.495758 4931 scope.go:117] "RemoveContainer" containerID="fc3576d96c401d3c6ca1b16f28423ab1c9a3427f74238dcd88f74a17b5011404" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.501826 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-shbng"] Dec 01 15:23:44 crc kubenswrapper[4931]: E1201 15:23:44.502180 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="sg-core" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.502196 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="sg-core" Dec 01 15:23:44 crc kubenswrapper[4931]: E1201 15:23:44.502210 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="ceilometer-central-agent" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.502217 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="ceilometer-central-agent" Dec 01 15:23:44 crc kubenswrapper[4931]: E1201 15:23:44.502238 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="ceilometer-notification-agent" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.502245 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="ceilometer-notification-agent" Dec 01 15:23:44 crc kubenswrapper[4931]: E1201 15:23:44.502264 4931 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="proxy-httpd" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.502270 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="proxy-httpd" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.502470 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="sg-core" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.502482 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="ceilometer-notification-agent" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.502494 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="ceilometer-central-agent" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.502502 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="25133581-9318-4292-acb4-38574d51628a" containerName="proxy-httpd" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.503798 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.512604 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-shbng"] Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.542751 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25133581-9318-4292-acb4-38574d51628a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.645967 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-catalog-content\") pod \"redhat-operators-shbng\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.647052 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-utilities\") pod \"redhat-operators-shbng\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.647256 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tpdp\" (UniqueName: \"kubernetes.io/projected/abefeaeb-e647-4548-969b-4be700456f44-kube-api-access-6tpdp\") pod \"redhat-operators-shbng\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.708545 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.735631 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.749210 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tpdp\" (UniqueName: \"kubernetes.io/projected/abefeaeb-e647-4548-969b-4be700456f44-kube-api-access-6tpdp\") pod \"redhat-operators-shbng\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.749322 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-catalog-content\") pod \"redhat-operators-shbng\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.749377 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-utilities\") pod \"redhat-operators-shbng\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.749820 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-utilities\") pod \"redhat-operators-shbng\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.749908 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-catalog-content\") pod \"redhat-operators-shbng\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: 
I1201 15:23:44.752172 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.775275 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.775373 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.777466 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tpdp\" (UniqueName: \"kubernetes.io/projected/abefeaeb-e647-4548-969b-4be700456f44-kube-api-access-6tpdp\") pod \"redhat-operators-shbng\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.779792 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.781784 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.782860 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.823417 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.851301 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.851364 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-config-data\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.851400 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.851472 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.851526 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8e3974-4d05-40e2-9306-01dd34663e53-log-httpd\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: 
I1201 15:23:44.851542 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8e3974-4d05-40e2-9306-01dd34663e53-run-httpd\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.851563 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-scripts\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.851677 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnpck\" (UniqueName: \"kubernetes.io/projected/de8e3974-4d05-40e2-9306-01dd34663e53-kube-api-access-gnpck\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.953907 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.954167 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8e3974-4d05-40e2-9306-01dd34663e53-log-httpd\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.954185 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/de8e3974-4d05-40e2-9306-01dd34663e53-run-httpd\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.954204 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-scripts\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.954224 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnpck\" (UniqueName: \"kubernetes.io/projected/de8e3974-4d05-40e2-9306-01dd34663e53-kube-api-access-gnpck\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.954287 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.954325 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-config-data\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.954344 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: 
I1201 15:23:44.958918 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8e3974-4d05-40e2-9306-01dd34663e53-log-httpd\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.960993 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8e3974-4d05-40e2-9306-01dd34663e53-run-httpd\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.963537 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.965993 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-scripts\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.968171 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.977020 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnpck\" (UniqueName: \"kubernetes.io/projected/de8e3974-4d05-40e2-9306-01dd34663e53-kube-api-access-gnpck\") pod \"ceilometer-0\" (UID: 
\"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.982231 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:44 crc kubenswrapper[4931]: I1201 15:23:44.985731 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8e3974-4d05-40e2-9306-01dd34663e53-config-data\") pod \"ceilometer-0\" (UID: \"de8e3974-4d05-40e2-9306-01dd34663e53\") " pod="openstack/ceilometer-0" Dec 01 15:23:45 crc kubenswrapper[4931]: I1201 15:23:45.097932 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 15:23:45 crc kubenswrapper[4931]: I1201 15:23:45.320499 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-shbng"] Dec 01 15:23:45 crc kubenswrapper[4931]: W1201 15:23:45.331331 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabefeaeb_e647_4548_969b_4be700456f44.slice/crio-a8bddc49c07cc2a69eaf0dff7130b9a2df230a109694436df25dbef04ae46a51 WatchSource:0}: Error finding container a8bddc49c07cc2a69eaf0dff7130b9a2df230a109694436df25dbef04ae46a51: Status 404 returned error can't find the container with id a8bddc49c07cc2a69eaf0dff7130b9a2df230a109694436df25dbef04ae46a51 Dec 01 15:23:45 crc kubenswrapper[4931]: I1201 15:23:45.379632 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 15:23:45 crc kubenswrapper[4931]: I1201 15:23:45.400249 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-shbng" 
event={"ID":"abefeaeb-e647-4548-969b-4be700456f44","Type":"ContainerStarted","Data":"a8bddc49c07cc2a69eaf0dff7130b9a2df230a109694436df25dbef04ae46a51"} Dec 01 15:23:46 crc kubenswrapper[4931]: I1201 15:23:46.252366 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25133581-9318-4292-acb4-38574d51628a" path="/var/lib/kubelet/pods/25133581-9318-4292-acb4-38574d51628a/volumes" Dec 01 15:23:46 crc kubenswrapper[4931]: I1201 15:23:46.409220 4931 generic.go:334] "Generic (PLEG): container finished" podID="abefeaeb-e647-4548-969b-4be700456f44" containerID="771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5" exitCode=0 Dec 01 15:23:46 crc kubenswrapper[4931]: I1201 15:23:46.409310 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-shbng" event={"ID":"abefeaeb-e647-4548-969b-4be700456f44","Type":"ContainerDied","Data":"771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5"} Dec 01 15:23:46 crc kubenswrapper[4931]: I1201 15:23:46.410204 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8e3974-4d05-40e2-9306-01dd34663e53","Type":"ContainerStarted","Data":"6f12301af048479526ea3254e63037bf7091fc8df9011c4c7a69efc338eb8d16"} Dec 01 15:23:47 crc kubenswrapper[4931]: I1201 15:23:47.438528 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8e3974-4d05-40e2-9306-01dd34663e53","Type":"ContainerStarted","Data":"6383ab9db7ed4e8ad7d43a4ef43552a12ae9968e49406cb9472d69e59c52a707"} Dec 01 15:23:49 crc kubenswrapper[4931]: I1201 15:23:49.459370 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8e3974-4d05-40e2-9306-01dd34663e53","Type":"ContainerStarted","Data":"619fc1a60bdcd21379f223ff85b45e2136a0e7a1bab84d8e19b05224b7e6dbf2"} Dec 01 15:23:50 crc kubenswrapper[4931]: I1201 15:23:50.592362 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/kube-state-metrics-0" Dec 01 15:24:00 crc kubenswrapper[4931]: I1201 15:24:00.561828 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8e3974-4d05-40e2-9306-01dd34663e53","Type":"ContainerStarted","Data":"02797475e71f25b1ed891e5566f0368666ae2dea1a6fee7dd4f974c6251a8b07"} Dec 01 15:24:01 crc kubenswrapper[4931]: I1201 15:24:01.578771 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-shbng" event={"ID":"abefeaeb-e647-4548-969b-4be700456f44","Type":"ContainerStarted","Data":"e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03"} Dec 01 15:24:02 crc kubenswrapper[4931]: I1201 15:24:02.590796 4931 generic.go:334] "Generic (PLEG): container finished" podID="abefeaeb-e647-4548-969b-4be700456f44" containerID="e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03" exitCode=0 Dec 01 15:24:02 crc kubenswrapper[4931]: I1201 15:24:02.590846 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-shbng" event={"ID":"abefeaeb-e647-4548-969b-4be700456f44","Type":"ContainerDied","Data":"e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03"} Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.481822 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p6ng7"] Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.487403 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.518088 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6ng7"] Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.626899 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47kj\" (UniqueName: \"kubernetes.io/projected/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-kube-api-access-g47kj\") pod \"certified-operators-p6ng7\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.626985 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-catalog-content\") pod \"certified-operators-p6ng7\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.627024 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-utilities\") pod \"certified-operators-p6ng7\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.729266 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g47kj\" (UniqueName: \"kubernetes.io/projected/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-kube-api-access-g47kj\") pod \"certified-operators-p6ng7\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.729337 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-catalog-content\") pod \"certified-operators-p6ng7\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.729357 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-utilities\") pod \"certified-operators-p6ng7\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.729844 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-catalog-content\") pod \"certified-operators-p6ng7\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.729886 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-utilities\") pod \"certified-operators-p6ng7\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.748307 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g47kj\" (UniqueName: \"kubernetes.io/projected/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-kube-api-access-g47kj\") pod \"certified-operators-p6ng7\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:04 crc kubenswrapper[4931]: I1201 15:24:04.815977 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:05 crc kubenswrapper[4931]: I1201 15:24:05.405185 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p6ng7"] Dec 01 15:24:05 crc kubenswrapper[4931]: I1201 15:24:05.626643 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6ng7" event={"ID":"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb","Type":"ContainerStarted","Data":"fea3d10ca3da0630f6253d40e1fccfb9f1f47c501eaacfcc7fa8dc9770831cda"} Dec 01 15:24:07 crc kubenswrapper[4931]: I1201 15:24:07.646138 4931 generic.go:334] "Generic (PLEG): container finished" podID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerID="9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d" exitCode=0 Dec 01 15:24:07 crc kubenswrapper[4931]: I1201 15:24:07.646637 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6ng7" event={"ID":"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb","Type":"ContainerDied","Data":"9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d"} Dec 01 15:24:07 crc kubenswrapper[4931]: E1201 15:24:07.829036 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfbec1c8_58b4_4c0d_a4a0_d4cb8d0b25eb.slice/crio-conmon-9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfbec1c8_58b4_4c0d_a4a0_d4cb8d0b25eb.slice/crio-9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d.scope\": RecentStats: unable to find data in memory cache]" Dec 01 15:24:08 crc kubenswrapper[4931]: I1201 15:24:08.657436 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"de8e3974-4d05-40e2-9306-01dd34663e53","Type":"ContainerStarted","Data":"dc1df62950f542cbb081f49356c743e0c2cb1e712ecc71934b5bc9f6de2a7a2a"} Dec 01 15:24:08 crc kubenswrapper[4931]: I1201 15:24:08.659568 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 15:24:08 crc kubenswrapper[4931]: I1201 15:24:08.662622 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-shbng" event={"ID":"abefeaeb-e647-4548-969b-4be700456f44","Type":"ContainerStarted","Data":"392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c"} Dec 01 15:24:08 crc kubenswrapper[4931]: I1201 15:24:08.685979 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.43575139 podStartE2EDuration="24.685955722s" podCreationTimestamp="2025-12-01 15:23:44 +0000 UTC" firstStartedPulling="2025-12-01 15:23:45.392476049 +0000 UTC m=+1371.818349716" lastFinishedPulling="2025-12-01 15:24:07.642680381 +0000 UTC m=+1394.068554048" observedRunningTime="2025-12-01 15:24:08.677517073 +0000 UTC m=+1395.103390750" watchObservedRunningTime="2025-12-01 15:24:08.685955722 +0000 UTC m=+1395.111829389" Dec 01 15:24:08 crc kubenswrapper[4931]: I1201 15:24:08.704757 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-shbng" podStartSLOduration=3.343059661 podStartE2EDuration="24.704735946s" podCreationTimestamp="2025-12-01 15:23:44 +0000 UTC" firstStartedPulling="2025-12-01 15:23:46.41248471 +0000 UTC m=+1372.838358377" lastFinishedPulling="2025-12-01 15:24:07.774160985 +0000 UTC m=+1394.200034662" observedRunningTime="2025-12-01 15:24:08.695883574 +0000 UTC m=+1395.121757241" watchObservedRunningTime="2025-12-01 15:24:08.704735946 +0000 UTC m=+1395.130609603" Dec 01 15:24:09 crc kubenswrapper[4931]: I1201 15:24:09.672685 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-p6ng7" event={"ID":"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb","Type":"ContainerStarted","Data":"202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4"} Dec 01 15:24:11 crc kubenswrapper[4931]: I1201 15:24:11.690640 4931 generic.go:334] "Generic (PLEG): container finished" podID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerID="202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4" exitCode=0 Dec 01 15:24:11 crc kubenswrapper[4931]: I1201 15:24:11.690740 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6ng7" event={"ID":"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb","Type":"ContainerDied","Data":"202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4"} Dec 01 15:24:12 crc kubenswrapper[4931]: I1201 15:24:12.701460 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6ng7" event={"ID":"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb","Type":"ContainerStarted","Data":"12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf"} Dec 01 15:24:12 crc kubenswrapper[4931]: I1201 15:24:12.721669 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p6ng7" podStartSLOduration=4.227217973 podStartE2EDuration="8.721652814s" podCreationTimestamp="2025-12-01 15:24:04 +0000 UTC" firstStartedPulling="2025-12-01 15:24:07.651695697 +0000 UTC m=+1394.077569364" lastFinishedPulling="2025-12-01 15:24:12.146130538 +0000 UTC m=+1398.572004205" observedRunningTime="2025-12-01 15:24:12.720960824 +0000 UTC m=+1399.146834501" watchObservedRunningTime="2025-12-01 15:24:12.721652814 +0000 UTC m=+1399.147526481" Dec 01 15:24:14 crc kubenswrapper[4931]: I1201 15:24:14.816617 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:14 crc kubenswrapper[4931]: I1201 15:24:14.817001 4931 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:14 crc kubenswrapper[4931]: I1201 15:24:14.824155 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:24:14 crc kubenswrapper[4931]: I1201 15:24:14.824495 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:24:14 crc kubenswrapper[4931]: I1201 15:24:14.868497 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:15 crc kubenswrapper[4931]: I1201 15:24:15.876623 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-shbng" podUID="abefeaeb-e647-4548-969b-4be700456f44" containerName="registry-server" probeResult="failure" output=< Dec 01 15:24:15 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Dec 01 15:24:15 crc kubenswrapper[4931]: > Dec 01 15:24:19 crc kubenswrapper[4931]: I1201 15:24:19.872077 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:24:19 crc kubenswrapper[4931]: I1201 15:24:19.872568 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:24:24 crc kubenswrapper[4931]: I1201 15:24:24.882933 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:24 crc kubenswrapper[4931]: I1201 15:24:24.887696 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:24:24 crc kubenswrapper[4931]: I1201 15:24:24.949591 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6ng7"] Dec 01 15:24:24 crc kubenswrapper[4931]: I1201 15:24:24.960692 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:24:25 crc kubenswrapper[4931]: I1201 15:24:25.835688 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p6ng7" podUID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerName="registry-server" containerID="cri-o://12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf" gracePeriod=2 Dec 01 15:24:25 crc kubenswrapper[4931]: I1201 15:24:25.930702 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-shbng"] Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.791076 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.848876 4931 generic.go:334] "Generic (PLEG): container finished" podID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerID="12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf" exitCode=0 Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.848950 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p6ng7" Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.848961 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6ng7" event={"ID":"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb","Type":"ContainerDied","Data":"12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf"} Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.849001 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p6ng7" event={"ID":"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb","Type":"ContainerDied","Data":"fea3d10ca3da0630f6253d40e1fccfb9f1f47c501eaacfcc7fa8dc9770831cda"} Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.849020 4931 scope.go:117] "RemoveContainer" containerID="12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf" Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.849454 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-shbng" podUID="abefeaeb-e647-4548-969b-4be700456f44" containerName="registry-server" containerID="cri-o://392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c" gracePeriod=2 Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.880494 4931 scope.go:117] "RemoveContainer" containerID="202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4" Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.901370 4931 scope.go:117] "RemoveContainer" containerID="9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d" Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.956556 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-catalog-content\") pod \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " Dec 01 15:24:26 
crc kubenswrapper[4931]: I1201 15:24:26.956621 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-utilities\") pod \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.956786 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g47kj\" (UniqueName: \"kubernetes.io/projected/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-kube-api-access-g47kj\") pod \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\" (UID: \"bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb\") " Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.958057 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-utilities" (OuterVolumeSpecName: "utilities") pod "bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" (UID: "bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:24:26 crc kubenswrapper[4931]: I1201 15:24:26.968679 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-kube-api-access-g47kj" (OuterVolumeSpecName: "kube-api-access-g47kj") pod "bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" (UID: "bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb"). InnerVolumeSpecName "kube-api-access-g47kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.005369 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" (UID: "bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.058991 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.059020 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.059029 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g47kj\" (UniqueName: \"kubernetes.io/projected/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb-kube-api-access-g47kj\") on node \"crc\" DevicePath \"\"" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.083628 4931 scope.go:117] "RemoveContainer" containerID="12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf" Dec 01 15:24:27 crc kubenswrapper[4931]: E1201 15:24:27.086478 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf\": container with ID starting with 12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf not found: ID does not exist" containerID="12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.086526 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf"} err="failed to get container status \"12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf\": rpc error: code = NotFound desc = could not find container \"12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf\": container with ID 
starting with 12e7ca238b60e5f4f7529bbbf7509c436a1e1c1d6518648e299c663b2e0910cf not found: ID does not exist" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.086559 4931 scope.go:117] "RemoveContainer" containerID="202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4" Dec 01 15:24:27 crc kubenswrapper[4931]: E1201 15:24:27.087194 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4\": container with ID starting with 202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4 not found: ID does not exist" containerID="202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.087247 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4"} err="failed to get container status \"202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4\": rpc error: code = NotFound desc = could not find container \"202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4\": container with ID starting with 202dc649ddd56d250a941d64d4fad2317f97938ec7e93d2593daf311fa14d2a4 not found: ID does not exist" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.087279 4931 scope.go:117] "RemoveContainer" containerID="9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d" Dec 01 15:24:27 crc kubenswrapper[4931]: E1201 15:24:27.087661 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d\": container with ID starting with 9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d not found: ID does not exist" containerID="9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d" Dec 01 
15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.087705 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d"} err="failed to get container status \"9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d\": rpc error: code = NotFound desc = could not find container \"9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d\": container with ID starting with 9424421259ae0967d4f96c899f51694f710f1a8bbac2a6f938539f89d5e3d97d not found: ID does not exist" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.183187 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p6ng7"] Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.191304 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p6ng7"] Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.271824 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.363608 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tpdp\" (UniqueName: \"kubernetes.io/projected/abefeaeb-e647-4548-969b-4be700456f44-kube-api-access-6tpdp\") pod \"abefeaeb-e647-4548-969b-4be700456f44\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.363934 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-catalog-content\") pod \"abefeaeb-e647-4548-969b-4be700456f44\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.363995 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-utilities\") pod \"abefeaeb-e647-4548-969b-4be700456f44\" (UID: \"abefeaeb-e647-4548-969b-4be700456f44\") " Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.364957 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-utilities" (OuterVolumeSpecName: "utilities") pod "abefeaeb-e647-4548-969b-4be700456f44" (UID: "abefeaeb-e647-4548-969b-4be700456f44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.368449 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abefeaeb-e647-4548-969b-4be700456f44-kube-api-access-6tpdp" (OuterVolumeSpecName: "kube-api-access-6tpdp") pod "abefeaeb-e647-4548-969b-4be700456f44" (UID: "abefeaeb-e647-4548-969b-4be700456f44"). InnerVolumeSpecName "kube-api-access-6tpdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.466814 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tpdp\" (UniqueName: \"kubernetes.io/projected/abefeaeb-e647-4548-969b-4be700456f44-kube-api-access-6tpdp\") on node \"crc\" DevicePath \"\"" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.466843 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.483950 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abefeaeb-e647-4548-969b-4be700456f44" (UID: "abefeaeb-e647-4548-969b-4be700456f44"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.569125 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abefeaeb-e647-4548-969b-4be700456f44-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.860319 4931 generic.go:334] "Generic (PLEG): container finished" podID="abefeaeb-e647-4548-969b-4be700456f44" containerID="392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c" exitCode=0 Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.860372 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-shbng" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.860402 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-shbng" event={"ID":"abefeaeb-e647-4548-969b-4be700456f44","Type":"ContainerDied","Data":"392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c"} Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.860435 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-shbng" event={"ID":"abefeaeb-e647-4548-969b-4be700456f44","Type":"ContainerDied","Data":"a8bddc49c07cc2a69eaf0dff7130b9a2df230a109694436df25dbef04ae46a51"} Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.860453 4931 scope.go:117] "RemoveContainer" containerID="392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.881154 4931 scope.go:117] "RemoveContainer" containerID="e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.889626 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-shbng"] Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.897834 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-shbng"] Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.917882 4931 scope.go:117] "RemoveContainer" containerID="771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.934077 4931 scope.go:117] "RemoveContainer" containerID="392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c" Dec 01 15:24:27 crc kubenswrapper[4931]: E1201 15:24:27.934530 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c\": container with ID starting with 392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c not found: ID does not exist" containerID="392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.934574 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c"} err="failed to get container status \"392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c\": rpc error: code = NotFound desc = could not find container \"392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c\": container with ID starting with 392d69b275f4826a6e030d6413161409e7a2f8b2da3b262dd208dc01f6e13d5c not found: ID does not exist" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.934602 4931 scope.go:117] "RemoveContainer" containerID="e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03" Dec 01 15:24:27 crc kubenswrapper[4931]: E1201 15:24:27.934912 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03\": container with ID starting with e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03 not found: ID does not exist" containerID="e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.934932 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03"} err="failed to get container status \"e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03\": rpc error: code = NotFound desc = could not find container \"e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03\": container with ID 
starting with e2fec5efbb4b16c1c4386106682ebe555c509f67f3787d5ee425debd85047e03 not found: ID does not exist" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.934949 4931 scope.go:117] "RemoveContainer" containerID="771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5" Dec 01 15:24:27 crc kubenswrapper[4931]: E1201 15:24:27.935207 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5\": container with ID starting with 771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5 not found: ID does not exist" containerID="771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5" Dec 01 15:24:27 crc kubenswrapper[4931]: I1201 15:24:27.935244 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5"} err="failed to get container status \"771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5\": rpc error: code = NotFound desc = could not find container \"771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5\": container with ID starting with 771c0e281c3015f55124fd57c48461f2e5a1589e23abfadb51dfca3ba09c28c5 not found: ID does not exist" Dec 01 15:24:28 crc kubenswrapper[4931]: I1201 15:24:28.253287 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abefeaeb-e647-4548-969b-4be700456f44" path="/var/lib/kubelet/pods/abefeaeb-e647-4548-969b-4be700456f44/volumes" Dec 01 15:24:28 crc kubenswrapper[4931]: I1201 15:24:28.254768 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" path="/var/lib/kubelet/pods/bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb/volumes" Dec 01 15:24:45 crc kubenswrapper[4931]: I1201 15:24:45.106240 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Dec 01 15:24:49 crc kubenswrapper[4931]: I1201 15:24:49.872030 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:24:49 crc kubenswrapper[4931]: I1201 15:24:49.872380 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:24:54 crc kubenswrapper[4931]: I1201 15:24:54.440498 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 15:24:55 crc kubenswrapper[4931]: I1201 15:24:55.842139 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 15:24:58 crc kubenswrapper[4931]: I1201 15:24:58.483912 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6f8b18d2-d611-4ad6-850a-4ad19544c016" containerName="rabbitmq" containerID="cri-o://46f10de2c518f8c69c966deba43efad45397695eb10c52be47377627996f1270" gracePeriod=604796 Dec 01 15:24:59 crc kubenswrapper[4931]: I1201 15:24:59.813968 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" containerName="rabbitmq" containerID="cri-o://daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e" gracePeriod=604797 Dec 01 15:25:00 crc kubenswrapper[4931]: I1201 15:25:00.823951 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6f8b18d2-d611-4ad6-850a-4ad19544c016" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Dec 01 15:25:01 crc kubenswrapper[4931]: I1201 15:25:01.154546 4931 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 01 15:25:04 crc kubenswrapper[4931]: I1201 15:25:04.804892 4931 generic.go:334] "Generic (PLEG): container finished" podID="6f8b18d2-d611-4ad6-850a-4ad19544c016" containerID="46f10de2c518f8c69c966deba43efad45397695eb10c52be47377627996f1270" exitCode=0 Dec 01 15:25:04 crc kubenswrapper[4931]: I1201 15:25:04.804984 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f8b18d2-d611-4ad6-850a-4ad19544c016","Type":"ContainerDied","Data":"46f10de2c518f8c69c966deba43efad45397695eb10c52be47377627996f1270"} Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.027254 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.218812 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-tls\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.218865 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-erlang-cookie\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.218890 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-confd\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.218907 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv89w\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-kube-api-access-xv89w\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.218952 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-server-conf\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.219045 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f8b18d2-d611-4ad6-850a-4ad19544c016-erlang-cookie-secret\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.219083 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-plugins\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.219121 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.219149 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-plugins-conf\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.219178 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f8b18d2-d611-4ad6-850a-4ad19544c016-pod-info\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.219200 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-config-data\") pod \"6f8b18d2-d611-4ad6-850a-4ad19544c016\" (UID: \"6f8b18d2-d611-4ad6-850a-4ad19544c016\") " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 
15:25:05.220141 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.220263 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.220281 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.227690 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f8b18d2-d611-4ad6-850a-4ad19544c016-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.245003 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.245543 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.245594 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-kube-api-access-xv89w" (OuterVolumeSpecName: "kube-api-access-xv89w") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "kube-api-access-xv89w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.246047 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6f8b18d2-d611-4ad6-850a-4ad19544c016-pod-info" (OuterVolumeSpecName: "pod-info") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.262835 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-config-data" (OuterVolumeSpecName: "config-data") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.288920 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-server-conf" (OuterVolumeSpecName: "server-conf") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322553 4931 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f8b18d2-d611-4ad6-850a-4ad19544c016-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322588 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322619 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322631 4931 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 
15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322642 4931 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f8b18d2-d611-4ad6-850a-4ad19544c016-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322657 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322668 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322679 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322691 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv89w\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-kube-api-access-xv89w\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.322701 4931 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f8b18d2-d611-4ad6-850a-4ad19544c016-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.333109 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6f8b18d2-d611-4ad6-850a-4ad19544c016" (UID: "6f8b18d2-d611-4ad6-850a-4ad19544c016"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.346017 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.424184 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.424225 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f8b18d2-d611-4ad6-850a-4ad19544c016-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.815249 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f8b18d2-d611-4ad6-850a-4ad19544c016","Type":"ContainerDied","Data":"9b0f0536388d6ff5ac626711009a78962f7fcf589407a339a8862800618100b5"} Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.815611 4931 scope.go:117] "RemoveContainer" containerID="46f10de2c518f8c69c966deba43efad45397695eb10c52be47377627996f1270" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.815346 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.852020 4931 scope.go:117] "RemoveContainer" containerID="c4f945b9f807d6cde63f94d15690bb2e2f0ad176652953dc94b9fb21983d50d3" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.855464 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.863925 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.887587 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 15:25:05 crc kubenswrapper[4931]: E1201 15:25:05.888131 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f8b18d2-d611-4ad6-850a-4ad19544c016" containerName="rabbitmq" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888149 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f8b18d2-d611-4ad6-850a-4ad19544c016" containerName="rabbitmq" Dec 01 15:25:05 crc kubenswrapper[4931]: E1201 15:25:05.888170 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerName="extract-content" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888178 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerName="extract-content" Dec 01 15:25:05 crc kubenswrapper[4931]: E1201 15:25:05.888200 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abefeaeb-e647-4548-969b-4be700456f44" containerName="extract-utilities" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888210 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="abefeaeb-e647-4548-969b-4be700456f44" containerName="extract-utilities" Dec 01 15:25:05 crc kubenswrapper[4931]: E1201 15:25:05.888223 4931 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="abefeaeb-e647-4548-969b-4be700456f44" containerName="registry-server" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888230 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="abefeaeb-e647-4548-969b-4be700456f44" containerName="registry-server" Dec 01 15:25:05 crc kubenswrapper[4931]: E1201 15:25:05.888241 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerName="registry-server" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888250 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerName="registry-server" Dec 01 15:25:05 crc kubenswrapper[4931]: E1201 15:25:05.888277 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abefeaeb-e647-4548-969b-4be700456f44" containerName="extract-content" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888284 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="abefeaeb-e647-4548-969b-4be700456f44" containerName="extract-content" Dec 01 15:25:05 crc kubenswrapper[4931]: E1201 15:25:05.888295 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f8b18d2-d611-4ad6-850a-4ad19544c016" containerName="setup-container" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888301 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f8b18d2-d611-4ad6-850a-4ad19544c016" containerName="setup-container" Dec 01 15:25:05 crc kubenswrapper[4931]: E1201 15:25:05.888319 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerName="extract-utilities" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888325 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerName="extract-utilities" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888551 4931 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bfbec1c8-58b4-4c0d-a4a0-d4cb8d0b25eb" containerName="registry-server" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888586 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f8b18d2-d611-4ad6-850a-4ad19544c016" containerName="rabbitmq" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.888595 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="abefeaeb-e647-4548-969b-4be700456f44" containerName="registry-server" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.890561 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.896715 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.896825 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.897047 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.897191 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.897255 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-j6jsj" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.897353 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.897417 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 15:25:05 crc kubenswrapper[4931]: I1201 15:25:05.913199 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 15:25:06 crc 
kubenswrapper[4931]: I1201 15:25:06.035651 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.035707 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.035731 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.035754 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.035868 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-config-data\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.035899 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.036149 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.036236 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.036263 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.036335 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6cpr\" (UniqueName: \"kubernetes.io/projected/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-kube-api-access-j6cpr\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.036450 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139007 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-config-data\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139067 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139129 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139151 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139168 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: 
\"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139198 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6cpr\" (UniqueName: \"kubernetes.io/projected/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-kube-api-access-j6cpr\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139244 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139302 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139330 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139354 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139380 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.139937 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.140121 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.140594 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.143363 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.143801 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-config-data\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.145895 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.146429 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.146958 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.154144 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.155213 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc 
kubenswrapper[4931]: I1201 15:25:06.155902 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6cpr\" (UniqueName: \"kubernetes.io/projected/bc2ff309-c81a-4d19-bfb0-99a4a975b70a-kube-api-access-j6cpr\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.180798 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"bc2ff309-c81a-4d19-bfb0-99a4a975b70a\") " pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.257186 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.268308 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f8b18d2-d611-4ad6-850a-4ad19544c016" path="/var/lib/kubelet/pods/6f8b18d2-d611-4ad6-850a-4ad19544c016/volumes" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.311163 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.460943 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-pod-info\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.461150 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-server-conf\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.461260 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-config-data\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.461286 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-plugins-conf\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.461480 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-erlang-cookie-secret\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.462102 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrdkt\" 
(UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-kube-api-access-jrdkt\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.462249 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-tls\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.462379 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-confd\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.462427 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-erlang-cookie\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.462444 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-plugins\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.462460 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\" (UID: \"a675ebc0-8c3b-4c43-884f-b32bd954ac6e\") " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 
15:25:06.462001 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.466900 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.468676 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.469098 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.498797 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.498846 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-kube-api-access-jrdkt" (OuterVolumeSpecName: "kube-api-access-jrdkt") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "kube-api-access-jrdkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.499030 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.499023 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-pod-info" (OuterVolumeSpecName: "pod-info") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.513869 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-config-data" (OuterVolumeSpecName: "config-data") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.527074 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-server-conf" (OuterVolumeSpecName: "server-conf") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565609 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565642 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565668 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565678 4931 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 
15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565687 4931 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565696 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565706 4931 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565714 4931 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565724 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrdkt\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-kube-api-access-jrdkt\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.565732 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.573186 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a675ebc0-8c3b-4c43-884f-b32bd954ac6e" (UID: "a675ebc0-8c3b-4c43-884f-b32bd954ac6e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.594924 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.666789 4931 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a675ebc0-8c3b-4c43-884f-b32bd954ac6e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.667133 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.736439 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.829465 4931 generic.go:334] "Generic (PLEG): container finished" podID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" containerID="daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e" exitCode=0 Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.829529 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.829526 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a675ebc0-8c3b-4c43-884f-b32bd954ac6e","Type":"ContainerDied","Data":"daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e"} Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.829623 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a675ebc0-8c3b-4c43-884f-b32bd954ac6e","Type":"ContainerDied","Data":"88d79ebedca822ce7a61d0f8db4e42deba1eeca895ace38b33f13e029d308ddb"} Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.829642 4931 scope.go:117] "RemoveContainer" containerID="daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.833722 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc2ff309-c81a-4d19-bfb0-99a4a975b70a","Type":"ContainerStarted","Data":"2198b00522078b20833136212fcd88085a9cbbb65b2be1f6a35840e797fb6c35"} Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.871505 4931 scope.go:117] "RemoveContainer" containerID="44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.876039 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.887691 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.911554 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 15:25:06 crc kubenswrapper[4931]: E1201 15:25:06.911996 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" 
containerName="setup-container" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.912017 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" containerName="setup-container" Dec 01 15:25:06 crc kubenswrapper[4931]: E1201 15:25:06.912061 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" containerName="rabbitmq" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.912071 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" containerName="rabbitmq" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.912278 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" containerName="rabbitmq" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.913417 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.918450 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.918758 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.918959 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.919236 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.919511 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.919813 4931 scope.go:117] "RemoveContainer" 
containerID="daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.920030 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ztd2n" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.920040 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 15:25:06 crc kubenswrapper[4931]: E1201 15:25:06.920677 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e\": container with ID starting with daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e not found: ID does not exist" containerID="daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.920717 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e"} err="failed to get container status \"daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e\": rpc error: code = NotFound desc = could not find container \"daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e\": container with ID starting with daa265a5e267895db3fb7f5c0262a02c5a6287dc51eafab8b85e220f1bc8011e not found: ID does not exist" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.920745 4931 scope.go:117] "RemoveContainer" containerID="44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff" Dec 01 15:25:06 crc kubenswrapper[4931]: E1201 15:25:06.921078 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff\": container with ID starting with 
44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff not found: ID does not exist" containerID="44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.921127 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff"} err="failed to get container status \"44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff\": rpc error: code = NotFound desc = could not find container \"44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff\": container with ID starting with 44f0e47d74078ec60006b9d6439ff27887fc34864e6507dde0fec8afdcf53cff not found: ID does not exist" Dec 01 15:25:06 crc kubenswrapper[4931]: I1201 15:25:06.955850 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074440 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b364f6e4-552e-435c-b684-d6ebbc851ef2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074709 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074758 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074822 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkf2x\" (UniqueName: \"kubernetes.io/projected/b364f6e4-552e-435c-b684-d6ebbc851ef2-kube-api-access-mkf2x\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074839 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074866 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074883 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b364f6e4-552e-435c-b684-d6ebbc851ef2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074899 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/b364f6e4-552e-435c-b684-d6ebbc851ef2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074918 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074961 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b364f6e4-552e-435c-b684-d6ebbc851ef2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.074996 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b364f6e4-552e-435c-b684-d6ebbc851ef2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176627 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176726 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkf2x\" (UniqueName: 
\"kubernetes.io/projected/b364f6e4-552e-435c-b684-d6ebbc851ef2-kube-api-access-mkf2x\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176751 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176782 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176801 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b364f6e4-552e-435c-b684-d6ebbc851ef2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176818 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b364f6e4-552e-435c-b684-d6ebbc851ef2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176845 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176870 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b364f6e4-552e-435c-b684-d6ebbc851ef2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176892 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b364f6e4-552e-435c-b684-d6ebbc851ef2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176925 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b364f6e4-552e-435c-b684-d6ebbc851ef2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.176958 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.177055 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") device mount path \"/mnt/openstack/pv03\"" 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.177303 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.177611 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.178581 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b364f6e4-552e-435c-b684-d6ebbc851ef2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.178627 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b364f6e4-552e-435c-b684-d6ebbc851ef2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.178717 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b364f6e4-552e-435c-b684-d6ebbc851ef2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.182041 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.182066 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b364f6e4-552e-435c-b684-d6ebbc851ef2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.184362 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b364f6e4-552e-435c-b684-d6ebbc851ef2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.184464 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b364f6e4-552e-435c-b684-d6ebbc851ef2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.194745 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkf2x\" (UniqueName: \"kubernetes.io/projected/b364f6e4-552e-435c-b684-d6ebbc851ef2-kube-api-access-mkf2x\") pod \"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.210563 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b364f6e4-552e-435c-b684-d6ebbc851ef2\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.265587 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.704573 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 15:25:07 crc kubenswrapper[4931]: W1201 15:25:07.705573 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb364f6e4_552e_435c_b684_d6ebbc851ef2.slice/crio-f34f8401e77c6c99ea5b1343107de1d937134220385ccc3f346fc88445ffd0ab WatchSource:0}: Error finding container f34f8401e77c6c99ea5b1343107de1d937134220385ccc3f346fc88445ffd0ab: Status 404 returned error can't find the container with id f34f8401e77c6c99ea5b1343107de1d937134220385ccc3f346fc88445ffd0ab Dec 01 15:25:07 crc kubenswrapper[4931]: I1201 15:25:07.842552 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b364f6e4-552e-435c-b684-d6ebbc851ef2","Type":"ContainerStarted","Data":"f34f8401e77c6c99ea5b1343107de1d937134220385ccc3f346fc88445ffd0ab"} Dec 01 15:25:08 crc kubenswrapper[4931]: I1201 15:25:08.254410 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a675ebc0-8c3b-4c43-884f-b32bd954ac6e" path="/var/lib/kubelet/pods/a675ebc0-8c3b-4c43-884f-b32bd954ac6e/volumes" Dec 01 15:25:08 crc kubenswrapper[4931]: I1201 15:25:08.856278 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc2ff309-c81a-4d19-bfb0-99a4a975b70a","Type":"ContainerStarted","Data":"4d4ad4f8d08630db2910af3c526de297b58aff077ff02eb1850a43ee0e5f0b94"} Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.776733 4931 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5576978c7c-nlgmq"] Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.778889 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.784240 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.802830 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-nlgmq"] Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.869601 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b364f6e4-552e-435c-b684-d6ebbc851ef2","Type":"ContainerStarted","Data":"a9bc5bcecf558f2b711708b3cd630fcce70742893a93c9bb1a838d1d9c7fb2ff"} Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.926736 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.927019 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.927111 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.927281 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-svc\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.927446 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-config\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.927554 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvfmb\" (UniqueName: \"kubernetes.io/projected/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-kube-api-access-cvfmb\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:09 crc kubenswrapper[4931]: I1201 15:25:09.927639 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.029066 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-svc\") 
pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.029378 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-config\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.029575 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvfmb\" (UniqueName: \"kubernetes.io/projected/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-kube-api-access-cvfmb\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.029696 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.029850 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.029987 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: 
\"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.030136 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-svc\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.030147 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.030922 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.031302 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-config\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.031798 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 
15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.032244 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.032838 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.060836 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvfmb\" (UniqueName: \"kubernetes.io/projected/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-kube-api-access-cvfmb\") pod \"dnsmasq-dns-5576978c7c-nlgmq\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.112344 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:10 crc kubenswrapper[4931]: W1201 15:25:10.586268 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39d70e78_caf7_4223_ba0d_e7b1aa6f5df4.slice/crio-b0283f7f8d1dd9dfaa79791c9dec4d45a21fcd9c6289e70b5e9292ff0438f044 WatchSource:0}: Error finding container b0283f7f8d1dd9dfaa79791c9dec4d45a21fcd9c6289e70b5e9292ff0438f044: Status 404 returned error can't find the container with id b0283f7f8d1dd9dfaa79791c9dec4d45a21fcd9c6289e70b5e9292ff0438f044 Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.587875 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-nlgmq"] Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.879897 4931 generic.go:334] "Generic (PLEG): container finished" podID="39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" containerID="8d37df6cf5bdb65abcec19d37df03b4413ec221420b8f0d0acb077bccb1f6626" exitCode=0 Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.879960 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" event={"ID":"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4","Type":"ContainerDied","Data":"8d37df6cf5bdb65abcec19d37df03b4413ec221420b8f0d0acb077bccb1f6626"} Dec 01 15:25:10 crc kubenswrapper[4931]: I1201 15:25:10.880441 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" event={"ID":"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4","Type":"ContainerStarted","Data":"b0283f7f8d1dd9dfaa79791c9dec4d45a21fcd9c6289e70b5e9292ff0438f044"} Dec 01 15:25:11 crc kubenswrapper[4931]: I1201 15:25:11.890971 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" event={"ID":"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4","Type":"ContainerStarted","Data":"f015028a0a7ad02dfa5e2fae242ceebdc77fb2ba1d3e4c40398a7b0bf9770a64"} Dec 01 15:25:11 crc 
kubenswrapper[4931]: I1201 15:25:11.891365 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:11 crc kubenswrapper[4931]: I1201 15:25:11.910248 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" podStartSLOduration=2.910228462 podStartE2EDuration="2.910228462s" podCreationTimestamp="2025-12-01 15:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:25:11.906458015 +0000 UTC m=+1458.332331692" watchObservedRunningTime="2025-12-01 15:25:11.910228462 +0000 UTC m=+1458.336102129" Dec 01 15:25:19 crc kubenswrapper[4931]: I1201 15:25:19.872651 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:25:19 crc kubenswrapper[4931]: I1201 15:25:19.873293 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:25:19 crc kubenswrapper[4931]: I1201 15:25:19.873340 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:25:19 crc kubenswrapper[4931]: I1201 15:25:19.874098 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58f06495cba8dbbb838e05fa0d374d0f9cc22d5fa8c965e16e9109a8373c2319"} 
pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:25:19 crc kubenswrapper[4931]: I1201 15:25:19.874149 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://58f06495cba8dbbb838e05fa0d374d0f9cc22d5fa8c965e16e9109a8373c2319" gracePeriod=600 Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.114558 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.174928 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8tb7h"] Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.175206 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" podUID="00d0b21e-97af-42df-b2f0-f137b01e4112" containerName="dnsmasq-dns" containerID="cri-o://96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e" gracePeriod=10 Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.343129 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-w6hcc"] Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.345460 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.362416 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-w6hcc"] Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.424523 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.424578 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.424666 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.424692 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.424748 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q7mm\" (UniqueName: \"kubernetes.io/projected/e71b5ae6-3b87-43c4-839b-350df6114a20-kube-api-access-4q7mm\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.424793 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-config\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.424817 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.526309 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.526358 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.526436 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4q7mm\" (UniqueName: \"kubernetes.io/projected/e71b5ae6-3b87-43c4-839b-350df6114a20-kube-api-access-4q7mm\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.526524 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-config\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.526574 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.526610 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.526682 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.527111 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.527419 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.527628 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.527680 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.527690 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-config\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.527984 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e71b5ae6-3b87-43c4-839b-350df6114a20-dns-swift-storage-0\") pod 
\"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.557116 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q7mm\" (UniqueName: \"kubernetes.io/projected/e71b5ae6-3b87-43c4-839b-350df6114a20-kube-api-access-4q7mm\") pod \"dnsmasq-dns-8c6f6df99-w6hcc\" (UID: \"e71b5ae6-3b87-43c4-839b-350df6114a20\") " pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.661266 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.818509 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.943041 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-sb\") pod \"00d0b21e-97af-42df-b2f0-f137b01e4112\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.943457 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-nb\") pod \"00d0b21e-97af-42df-b2f0-f137b01e4112\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.943525 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-svc\") pod \"00d0b21e-97af-42df-b2f0-f137b01e4112\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 
15:25:20.943551 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-swift-storage-0\") pod \"00d0b21e-97af-42df-b2f0-f137b01e4112\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.943609 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-config\") pod \"00d0b21e-97af-42df-b2f0-f137b01e4112\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.943644 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bk4n\" (UniqueName: \"kubernetes.io/projected/00d0b21e-97af-42df-b2f0-f137b01e4112-kube-api-access-2bk4n\") pod \"00d0b21e-97af-42df-b2f0-f137b01e4112\" (UID: \"00d0b21e-97af-42df-b2f0-f137b01e4112\") " Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.953682 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d0b21e-97af-42df-b2f0-f137b01e4112-kube-api-access-2bk4n" (OuterVolumeSpecName: "kube-api-access-2bk4n") pod "00d0b21e-97af-42df-b2f0-f137b01e4112" (UID: "00d0b21e-97af-42df-b2f0-f137b01e4112"). InnerVolumeSpecName "kube-api-access-2bk4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.975202 4931 generic.go:334] "Generic (PLEG): container finished" podID="00d0b21e-97af-42df-b2f0-f137b01e4112" containerID="96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e" exitCode=0 Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.975257 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" event={"ID":"00d0b21e-97af-42df-b2f0-f137b01e4112","Type":"ContainerDied","Data":"96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e"} Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.975309 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" event={"ID":"00d0b21e-97af-42df-b2f0-f137b01e4112","Type":"ContainerDied","Data":"edd30ac6a18164f82337bf9552e70b64a9849409e684337fe6c279520d0b9ce0"} Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.975336 4931 scope.go:117] "RemoveContainer" containerID="96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.975708 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-8tb7h" Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.980048 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="58f06495cba8dbbb838e05fa0d374d0f9cc22d5fa8c965e16e9109a8373c2319" exitCode=0 Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.980087 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"58f06495cba8dbbb838e05fa0d374d0f9cc22d5fa8c965e16e9109a8373c2319"} Dec 01 15:25:20 crc kubenswrapper[4931]: I1201 15:25:20.980132 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9"} Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.008742 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00d0b21e-97af-42df-b2f0-f137b01e4112" (UID: "00d0b21e-97af-42df-b2f0-f137b01e4112"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.010773 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00d0b21e-97af-42df-b2f0-f137b01e4112" (UID: "00d0b21e-97af-42df-b2f0-f137b01e4112"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.017288 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "00d0b21e-97af-42df-b2f0-f137b01e4112" (UID: "00d0b21e-97af-42df-b2f0-f137b01e4112"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.017434 4931 scope.go:117] "RemoveContainer" containerID="ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.028856 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00d0b21e-97af-42df-b2f0-f137b01e4112" (UID: "00d0b21e-97af-42df-b2f0-f137b01e4112"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.042674 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-config" (OuterVolumeSpecName: "config") pod "00d0b21e-97af-42df-b2f0-f137b01e4112" (UID: "00d0b21e-97af-42df-b2f0-f137b01e4112"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.046697 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.046726 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.046736 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.046746 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.046754 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d0b21e-97af-42df-b2f0-f137b01e4112-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.046765 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bk4n\" (UniqueName: \"kubernetes.io/projected/00d0b21e-97af-42df-b2f0-f137b01e4112-kube-api-access-2bk4n\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.112696 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-w6hcc"] Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.115764 4931 scope.go:117] "RemoveContainer" containerID="96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e" 
Dec 01 15:25:21 crc kubenswrapper[4931]: E1201 15:25:21.116193 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e\": container with ID starting with 96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e not found: ID does not exist" containerID="96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.116222 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e"} err="failed to get container status \"96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e\": rpc error: code = NotFound desc = could not find container \"96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e\": container with ID starting with 96cf3a26f8693aaae22e2ad7cea050e3f50650147bcb399871f8afb3ac42311e not found: ID does not exist" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.116241 4931 scope.go:117] "RemoveContainer" containerID="ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e" Dec 01 15:25:21 crc kubenswrapper[4931]: E1201 15:25:21.116564 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e\": container with ID starting with ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e not found: ID does not exist" containerID="ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.116614 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e"} err="failed to get container status 
\"ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e\": rpc error: code = NotFound desc = could not find container \"ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e\": container with ID starting with ab618bcdb344c41e7bd0c9f00c624a66bd34dcbe0d9b9179284ab2097166306e not found: ID does not exist" Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.116647 4931 scope.go:117] "RemoveContainer" containerID="57835c837fadcd2c88b5f726b0f5a7aef7db7caf224620b840275c3b23741956" Dec 01 15:25:21 crc kubenswrapper[4931]: W1201 15:25:21.121970 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71b5ae6_3b87_43c4_839b_350df6114a20.slice/crio-76a3128662caccb6c5eb3d9342e1d26b2553fb03da5bd8ff7f6f5158ff31b6ed WatchSource:0}: Error finding container 76a3128662caccb6c5eb3d9342e1d26b2553fb03da5bd8ff7f6f5158ff31b6ed: Status 404 returned error can't find the container with id 76a3128662caccb6c5eb3d9342e1d26b2553fb03da5bd8ff7f6f5158ff31b6ed Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.324863 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8tb7h"] Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.335927 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-8tb7h"] Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.995443 4931 generic.go:334] "Generic (PLEG): container finished" podID="e71b5ae6-3b87-43c4-839b-350df6114a20" containerID="1ea11824833864de7ddc42571b36384f55b425b97912c4463ea2ee3ebb15bd81" exitCode=0 Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 15:25:21.995747 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" event={"ID":"e71b5ae6-3b87-43c4-839b-350df6114a20","Type":"ContainerDied","Data":"1ea11824833864de7ddc42571b36384f55b425b97912c4463ea2ee3ebb15bd81"} Dec 01 15:25:21 crc kubenswrapper[4931]: I1201 
15:25:21.995773 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" event={"ID":"e71b5ae6-3b87-43c4-839b-350df6114a20","Type":"ContainerStarted","Data":"76a3128662caccb6c5eb3d9342e1d26b2553fb03da5bd8ff7f6f5158ff31b6ed"} Dec 01 15:25:22 crc kubenswrapper[4931]: I1201 15:25:22.255139 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d0b21e-97af-42df-b2f0-f137b01e4112" path="/var/lib/kubelet/pods/00d0b21e-97af-42df-b2f0-f137b01e4112/volumes" Dec 01 15:25:23 crc kubenswrapper[4931]: I1201 15:25:23.015276 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" event={"ID":"e71b5ae6-3b87-43c4-839b-350df6114a20","Type":"ContainerStarted","Data":"3828fe2c6205d2c9fdfd923cd966aa9f8c52f14c9801f4a5a699a0bbdfa9ba3f"} Dec 01 15:25:23 crc kubenswrapper[4931]: I1201 15:25:23.016044 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:23 crc kubenswrapper[4931]: I1201 15:25:23.049743 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" podStartSLOduration=3.049725759 podStartE2EDuration="3.049725759s" podCreationTimestamp="2025-12-01 15:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:25:23.047876597 +0000 UTC m=+1469.473750264" watchObservedRunningTime="2025-12-01 15:25:23.049725759 +0000 UTC m=+1469.475599426" Dec 01 15:25:30 crc kubenswrapper[4931]: I1201 15:25:30.663479 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-w6hcc" Dec 01 15:25:30 crc kubenswrapper[4931]: I1201 15:25:30.742568 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-nlgmq"] Dec 01 15:25:30 crc kubenswrapper[4931]: I1201 15:25:30.742873 4931 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" podUID="39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" containerName="dnsmasq-dns" containerID="cri-o://f015028a0a7ad02dfa5e2fae242ceebdc77fb2ba1d3e4c40398a7b0bf9770a64" gracePeriod=10 Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.084944 4931 generic.go:334] "Generic (PLEG): container finished" podID="39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" containerID="f015028a0a7ad02dfa5e2fae242ceebdc77fb2ba1d3e4c40398a7b0bf9770a64" exitCode=0 Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.085135 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" event={"ID":"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4","Type":"ContainerDied","Data":"f015028a0a7ad02dfa5e2fae242ceebdc77fb2ba1d3e4c40398a7b0bf9770a64"} Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.514874 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.654988 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-nb\") pod \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.655074 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvfmb\" (UniqueName: \"kubernetes.io/projected/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-kube-api-access-cvfmb\") pod \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.655227 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-config\") pod \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.655254 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-svc\") pod \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.655299 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-swift-storage-0\") pod \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.655461 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-sb\") pod \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.655528 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-openstack-edpm-ipam\") pod \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\" (UID: \"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4\") " Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.668740 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-kube-api-access-cvfmb" (OuterVolumeSpecName: "kube-api-access-cvfmb") pod "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" (UID: "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4"). InnerVolumeSpecName "kube-api-access-cvfmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.705058 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" (UID: "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.712368 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" (UID: "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.717988 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-config" (OuterVolumeSpecName: "config") pod "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" (UID: "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.722959 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" (UID: "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.724733 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" (UID: "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.733170 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" (UID: "39d70e78-caf7-4223-ba0d-e7b1aa6f5df4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.759258 4931 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.759302 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.759316 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.759330 4931 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.759341 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvfmb\" (UniqueName: \"kubernetes.io/projected/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-kube-api-access-cvfmb\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.759356 4931 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:31 crc kubenswrapper[4931]: I1201 15:25:31.759368 4931 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 15:25:32 crc kubenswrapper[4931]: I1201 15:25:32.098863 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" event={"ID":"39d70e78-caf7-4223-ba0d-e7b1aa6f5df4","Type":"ContainerDied","Data":"b0283f7f8d1dd9dfaa79791c9dec4d45a21fcd9c6289e70b5e9292ff0438f044"} Dec 01 15:25:32 crc kubenswrapper[4931]: I1201 15:25:32.098931 4931 scope.go:117] "RemoveContainer" containerID="f015028a0a7ad02dfa5e2fae242ceebdc77fb2ba1d3e4c40398a7b0bf9770a64" Dec 01 15:25:32 crc kubenswrapper[4931]: I1201 15:25:32.099125 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-nlgmq" Dec 01 15:25:32 crc kubenswrapper[4931]: I1201 15:25:32.125658 4931 scope.go:117] "RemoveContainer" containerID="8d37df6cf5bdb65abcec19d37df03b4413ec221420b8f0d0acb077bccb1f6626" Dec 01 15:25:32 crc kubenswrapper[4931]: I1201 15:25:32.161896 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-nlgmq"] Dec 01 15:25:32 crc kubenswrapper[4931]: I1201 15:25:32.172095 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-nlgmq"] Dec 01 15:25:32 crc kubenswrapper[4931]: I1201 15:25:32.254571 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" path="/var/lib/kubelet/pods/39d70e78-caf7-4223-ba0d-e7b1aa6f5df4/volumes" Dec 01 15:25:35 crc kubenswrapper[4931]: I1201 15:25:35.563227 4931 scope.go:117] "RemoveContainer" containerID="6bbf8e217e78cea3c06a545bdebf79654405c4d14fcb17a11c668f27189b5543" Dec 01 15:25:35 crc kubenswrapper[4931]: I1201 15:25:35.598369 4931 scope.go:117] "RemoveContainer" containerID="1e4ee656f788c135500ac63c83ccdd8364b006c0124aa97183cd21c2417e9f26" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.528011 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj"] Dec 01 15:25:39 crc kubenswrapper[4931]: E1201 15:25:39.529113 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" containerName="dnsmasq-dns" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.529131 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" containerName="dnsmasq-dns" Dec 01 15:25:39 crc kubenswrapper[4931]: E1201 15:25:39.529160 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d0b21e-97af-42df-b2f0-f137b01e4112" containerName="init" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 
15:25:39.529168 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d0b21e-97af-42df-b2f0-f137b01e4112" containerName="init" Dec 01 15:25:39 crc kubenswrapper[4931]: E1201 15:25:39.529181 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" containerName="init" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.529189 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" containerName="init" Dec 01 15:25:39 crc kubenswrapper[4931]: E1201 15:25:39.529200 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d0b21e-97af-42df-b2f0-f137b01e4112" containerName="dnsmasq-dns" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.529207 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d0b21e-97af-42df-b2f0-f137b01e4112" containerName="dnsmasq-dns" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.529638 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d0b21e-97af-42df-b2f0-f137b01e4112" containerName="dnsmasq-dns" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.529671 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d70e78-caf7-4223-ba0d-e7b1aa6f5df4" containerName="dnsmasq-dns" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.530537 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.532882 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.533184 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.533302 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.535607 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.539909 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj"] Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.620194 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.620293 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.620333 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.620357 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6jtr\" (UniqueName: \"kubernetes.io/projected/c44015a7-22fa-4746-8536-2f7c70888a5d-kube-api-access-g6jtr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.722045 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.722089 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6jtr\" (UniqueName: \"kubernetes.io/projected/c44015a7-22fa-4746-8536-2f7c70888a5d-kube-api-access-g6jtr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.722182 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.722257 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.728250 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.728502 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.729507 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: 
I1201 15:25:39.739762 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6jtr\" (UniqueName: \"kubernetes.io/projected/c44015a7-22fa-4746-8536-2f7c70888a5d-kube-api-access-g6jtr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:39 crc kubenswrapper[4931]: I1201 15:25:39.849811 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:25:40 crc kubenswrapper[4931]: I1201 15:25:40.430738 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj"] Dec 01 15:25:41 crc kubenswrapper[4931]: I1201 15:25:41.187357 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" event={"ID":"c44015a7-22fa-4746-8536-2f7c70888a5d","Type":"ContainerStarted","Data":"5ad9aa0d7a5cef8b7de530944664cf49acb5a682ab63d69f4164b6f019121c7b"} Dec 01 15:25:41 crc kubenswrapper[4931]: I1201 15:25:41.188942 4931 generic.go:334] "Generic (PLEG): container finished" podID="bc2ff309-c81a-4d19-bfb0-99a4a975b70a" containerID="4d4ad4f8d08630db2910af3c526de297b58aff077ff02eb1850a43ee0e5f0b94" exitCode=0 Dec 01 15:25:41 crc kubenswrapper[4931]: I1201 15:25:41.188976 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc2ff309-c81a-4d19-bfb0-99a4a975b70a","Type":"ContainerDied","Data":"4d4ad4f8d08630db2910af3c526de297b58aff077ff02eb1850a43ee0e5f0b94"} Dec 01 15:25:42 crc kubenswrapper[4931]: I1201 15:25:42.203765 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bc2ff309-c81a-4d19-bfb0-99a4a975b70a","Type":"ContainerStarted","Data":"9f3c43ab5f4f6d45c977b351aaad1bc863093feec871865ddc2d1e42663fe75e"} 
Dec 01 15:25:42 crc kubenswrapper[4931]: I1201 15:25:42.205679 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 15:25:42 crc kubenswrapper[4931]: I1201 15:25:42.209118 4931 generic.go:334] "Generic (PLEG): container finished" podID="b364f6e4-552e-435c-b684-d6ebbc851ef2" containerID="a9bc5bcecf558f2b711708b3cd630fcce70742893a93c9bb1a838d1d9c7fb2ff" exitCode=0 Dec 01 15:25:42 crc kubenswrapper[4931]: I1201 15:25:42.209157 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b364f6e4-552e-435c-b684-d6ebbc851ef2","Type":"ContainerDied","Data":"a9bc5bcecf558f2b711708b3cd630fcce70742893a93c9bb1a838d1d9c7fb2ff"} Dec 01 15:25:42 crc kubenswrapper[4931]: I1201 15:25:42.241556 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.241539067 podStartE2EDuration="37.241539067s" podCreationTimestamp="2025-12-01 15:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:25:42.234480857 +0000 UTC m=+1488.660354534" watchObservedRunningTime="2025-12-01 15:25:42.241539067 +0000 UTC m=+1488.667412734" Dec 01 15:25:43 crc kubenswrapper[4931]: I1201 15:25:43.224253 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b364f6e4-552e-435c-b684-d6ebbc851ef2","Type":"ContainerStarted","Data":"6c9eba882ae8673fcc55405dc0806a9b61f9b851a5b3f93ff4bbb2161532016c"} Dec 01 15:25:43 crc kubenswrapper[4931]: I1201 15:25:43.264413 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.264356807 podStartE2EDuration="37.264356807s" podCreationTimestamp="2025-12-01 15:25:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-01 15:25:43.255003581 +0000 UTC m=+1489.680877248" watchObservedRunningTime="2025-12-01 15:25:43.264356807 +0000 UTC m=+1489.690230474" Dec 01 15:25:47 crc kubenswrapper[4931]: I1201 15:25:47.266557 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:25:50 crc kubenswrapper[4931]: I1201 15:25:50.286451 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" event={"ID":"c44015a7-22fa-4746-8536-2f7c70888a5d","Type":"ContainerStarted","Data":"d94fd4b21666b3cde4534c4e083543e77e651a2dcf9ffcf949e5e6c37e972714"} Dec 01 15:25:50 crc kubenswrapper[4931]: I1201 15:25:50.306170 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" podStartSLOduration=2.128690962 podStartE2EDuration="11.306152079s" podCreationTimestamp="2025-12-01 15:25:39 +0000 UTC" firstStartedPulling="2025-12-01 15:25:40.431950991 +0000 UTC m=+1486.857824658" lastFinishedPulling="2025-12-01 15:25:49.609412108 +0000 UTC m=+1496.035285775" observedRunningTime="2025-12-01 15:25:50.303278648 +0000 UTC m=+1496.729152325" watchObservedRunningTime="2025-12-01 15:25:50.306152079 +0000 UTC m=+1496.732025756" Dec 01 15:25:56 crc kubenswrapper[4931]: I1201 15:25:56.259628 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 15:25:57 crc kubenswrapper[4931]: I1201 15:25:57.269739 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 15:26:02 crc kubenswrapper[4931]: I1201 15:26:02.414042 4931 generic.go:334] "Generic (PLEG): container finished" podID="c44015a7-22fa-4746-8536-2f7c70888a5d" containerID="d94fd4b21666b3cde4534c4e083543e77e651a2dcf9ffcf949e5e6c37e972714" exitCode=0 Dec 01 15:26:02 crc kubenswrapper[4931]: I1201 15:26:02.414122 
4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" event={"ID":"c44015a7-22fa-4746-8536-2f7c70888a5d","Type":"ContainerDied","Data":"d94fd4b21666b3cde4534c4e083543e77e651a2dcf9ffcf949e5e6c37e972714"} Dec 01 15:26:03 crc kubenswrapper[4931]: I1201 15:26:03.858352 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.026545 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-ssh-key\") pod \"c44015a7-22fa-4746-8536-2f7c70888a5d\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.026605 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-repo-setup-combined-ca-bundle\") pod \"c44015a7-22fa-4746-8536-2f7c70888a5d\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.026690 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-inventory\") pod \"c44015a7-22fa-4746-8536-2f7c70888a5d\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.026778 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6jtr\" (UniqueName: \"kubernetes.io/projected/c44015a7-22fa-4746-8536-2f7c70888a5d-kube-api-access-g6jtr\") pod \"c44015a7-22fa-4746-8536-2f7c70888a5d\" (UID: \"c44015a7-22fa-4746-8536-2f7c70888a5d\") " Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.032122 4931 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c44015a7-22fa-4746-8536-2f7c70888a5d" (UID: "c44015a7-22fa-4746-8536-2f7c70888a5d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.032212 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c44015a7-22fa-4746-8536-2f7c70888a5d-kube-api-access-g6jtr" (OuterVolumeSpecName: "kube-api-access-g6jtr") pod "c44015a7-22fa-4746-8536-2f7c70888a5d" (UID: "c44015a7-22fa-4746-8536-2f7c70888a5d"). InnerVolumeSpecName "kube-api-access-g6jtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.054244 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c44015a7-22fa-4746-8536-2f7c70888a5d" (UID: "c44015a7-22fa-4746-8536-2f7c70888a5d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.058706 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-inventory" (OuterVolumeSpecName: "inventory") pod "c44015a7-22fa-4746-8536-2f7c70888a5d" (UID: "c44015a7-22fa-4746-8536-2f7c70888a5d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.129043 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.129078 4931 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.129091 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c44015a7-22fa-4746-8536-2f7c70888a5d-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.129100 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6jtr\" (UniqueName: \"kubernetes.io/projected/c44015a7-22fa-4746-8536-2f7c70888a5d-kube-api-access-g6jtr\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.434793 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" event={"ID":"c44015a7-22fa-4746-8536-2f7c70888a5d","Type":"ContainerDied","Data":"5ad9aa0d7a5cef8b7de530944664cf49acb5a682ab63d69f4164b6f019121c7b"} Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.434838 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad9aa0d7a5cef8b7de530944664cf49acb5a682ab63d69f4164b6f019121c7b" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.434860 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.525276 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv"] Dec 01 15:26:04 crc kubenswrapper[4931]: E1201 15:26:04.525921 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c44015a7-22fa-4746-8536-2f7c70888a5d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.525946 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c44015a7-22fa-4746-8536-2f7c70888a5d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.526302 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c44015a7-22fa-4746-8536-2f7c70888a5d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.527122 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.529093 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.529615 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.529876 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.530071 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.536725 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv"] Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.638580 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wx9bv\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.638648 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wx9bv\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.639076 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp2lh\" (UniqueName: \"kubernetes.io/projected/000355db-6a0a-46ab-8e8e-1040c775de8a-kube-api-access-fp2lh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wx9bv\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.741289 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wx9bv\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.741357 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wx9bv\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.741505 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp2lh\" (UniqueName: \"kubernetes.io/projected/000355db-6a0a-46ab-8e8e-1040c775de8a-kube-api-access-fp2lh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wx9bv\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.744924 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wx9bv\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.746956 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wx9bv\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.766626 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp2lh\" (UniqueName: \"kubernetes.io/projected/000355db-6a0a-46ab-8e8e-1040c775de8a-kube-api-access-fp2lh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wx9bv\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:04 crc kubenswrapper[4931]: I1201 15:26:04.848978 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:05 crc kubenswrapper[4931]: I1201 15:26:05.353733 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv"] Dec 01 15:26:05 crc kubenswrapper[4931]: W1201 15:26:05.358779 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod000355db_6a0a_46ab_8e8e_1040c775de8a.slice/crio-69eaf75b489d9715782db8d8e212d97975953b5f7b66304a180a62b131b7d0cc WatchSource:0}: Error finding container 69eaf75b489d9715782db8d8e212d97975953b5f7b66304a180a62b131b7d0cc: Status 404 returned error can't find the container with id 69eaf75b489d9715782db8d8e212d97975953b5f7b66304a180a62b131b7d0cc Dec 01 15:26:05 crc kubenswrapper[4931]: I1201 15:26:05.445262 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" event={"ID":"000355db-6a0a-46ab-8e8e-1040c775de8a","Type":"ContainerStarted","Data":"69eaf75b489d9715782db8d8e212d97975953b5f7b66304a180a62b131b7d0cc"} Dec 01 15:26:05 crc kubenswrapper[4931]: I1201 15:26:05.974246 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bcckx"] Dec 01 15:26:05 crc kubenswrapper[4931]: I1201 15:26:05.977071 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:05 crc kubenswrapper[4931]: I1201 15:26:05.992332 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcckx"] Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.171489 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-utilities\") pod \"community-operators-bcckx\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.171534 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-catalog-content\") pod \"community-operators-bcckx\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.171593 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfvm2\" (UniqueName: \"kubernetes.io/projected/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-kube-api-access-wfvm2\") pod \"community-operators-bcckx\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.273839 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-utilities\") pod \"community-operators-bcckx\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.273878 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-catalog-content\") pod \"community-operators-bcckx\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.274530 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-catalog-content\") pod \"community-operators-bcckx\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.274831 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-utilities\") pod \"community-operators-bcckx\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.275096 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfvm2\" (UniqueName: \"kubernetes.io/projected/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-kube-api-access-wfvm2\") pod \"community-operators-bcckx\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.295229 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfvm2\" (UniqueName: \"kubernetes.io/projected/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-kube-api-access-wfvm2\") pod \"community-operators-bcckx\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.310248 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.466371 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" event={"ID":"000355db-6a0a-46ab-8e8e-1040c775de8a","Type":"ContainerStarted","Data":"bb02d148e8aec1126fa9ab6ea51e4d4e7a3a61316673a364b62c9eeac9a26b7c"} Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.492679 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" podStartSLOduration=2.053543769 podStartE2EDuration="2.492659098s" podCreationTimestamp="2025-12-01 15:26:04 +0000 UTC" firstStartedPulling="2025-12-01 15:26:05.361094908 +0000 UTC m=+1511.786968575" lastFinishedPulling="2025-12-01 15:26:05.800210237 +0000 UTC m=+1512.226083904" observedRunningTime="2025-12-01 15:26:06.484750273 +0000 UTC m=+1512.910623940" watchObservedRunningTime="2025-12-01 15:26:06.492659098 +0000 UTC m=+1512.918532785" Dec 01 15:26:06 crc kubenswrapper[4931]: I1201 15:26:06.844021 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcckx"] Dec 01 15:26:06 crc kubenswrapper[4931]: W1201 15:26:06.852132 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fcf7cb_6d10_4298_9884_2ad7c0837fa6.slice/crio-2ed5434a5e7fea0e890ee5cea1f391ca72debb0863e54f9badc5ba5cd60d0514 WatchSource:0}: Error finding container 2ed5434a5e7fea0e890ee5cea1f391ca72debb0863e54f9badc5ba5cd60d0514: Status 404 returned error can't find the container with id 2ed5434a5e7fea0e890ee5cea1f391ca72debb0863e54f9badc5ba5cd60d0514 Dec 01 15:26:07 crc kubenswrapper[4931]: I1201 15:26:07.478264 4931 generic.go:334] "Generic (PLEG): container finished" podID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" 
containerID="09a556f7f88d101033884e665fd170496a79993d0d67a85106fa97301b63020b" exitCode=0 Dec 01 15:26:07 crc kubenswrapper[4931]: I1201 15:26:07.478340 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcckx" event={"ID":"16fcf7cb-6d10-4298-9884-2ad7c0837fa6","Type":"ContainerDied","Data":"09a556f7f88d101033884e665fd170496a79993d0d67a85106fa97301b63020b"} Dec 01 15:26:07 crc kubenswrapper[4931]: I1201 15:26:07.478617 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcckx" event={"ID":"16fcf7cb-6d10-4298-9884-2ad7c0837fa6","Type":"ContainerStarted","Data":"2ed5434a5e7fea0e890ee5cea1f391ca72debb0863e54f9badc5ba5cd60d0514"} Dec 01 15:26:09 crc kubenswrapper[4931]: I1201 15:26:09.497278 4931 generic.go:334] "Generic (PLEG): container finished" podID="000355db-6a0a-46ab-8e8e-1040c775de8a" containerID="bb02d148e8aec1126fa9ab6ea51e4d4e7a3a61316673a364b62c9eeac9a26b7c" exitCode=0 Dec 01 15:26:09 crc kubenswrapper[4931]: I1201 15:26:09.497355 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" event={"ID":"000355db-6a0a-46ab-8e8e-1040c775de8a","Type":"ContainerDied","Data":"bb02d148e8aec1126fa9ab6ea51e4d4e7a3a61316673a364b62c9eeac9a26b7c"} Dec 01 15:26:09 crc kubenswrapper[4931]: I1201 15:26:09.500802 4931 generic.go:334] "Generic (PLEG): container finished" podID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerID="efdff8afa9a70c57f16de9255e2071f5e0cf666867a6277ce31fbc1b2d53db1f" exitCode=0 Dec 01 15:26:09 crc kubenswrapper[4931]: I1201 15:26:09.500851 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcckx" event={"ID":"16fcf7cb-6d10-4298-9884-2ad7c0837fa6","Type":"ContainerDied","Data":"efdff8afa9a70c57f16de9255e2071f5e0cf666867a6277ce31fbc1b2d53db1f"} Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.912182 4931 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.955734 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hbtsz"] Dec 01 15:26:10 crc kubenswrapper[4931]: E1201 15:26:10.956171 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000355db-6a0a-46ab-8e8e-1040c775de8a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.956190 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="000355db-6a0a-46ab-8e8e-1040c775de8a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.956431 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="000355db-6a0a-46ab-8e8e-1040c775de8a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.958172 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.969297 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp2lh\" (UniqueName: \"kubernetes.io/projected/000355db-6a0a-46ab-8e8e-1040c775de8a-kube-api-access-fp2lh\") pod \"000355db-6a0a-46ab-8e8e-1040c775de8a\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.969419 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-inventory\") pod \"000355db-6a0a-46ab-8e8e-1040c775de8a\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.969598 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-ssh-key\") pod \"000355db-6a0a-46ab-8e8e-1040c775de8a\" (UID: \"000355db-6a0a-46ab-8e8e-1040c775de8a\") " Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.972237 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv97l\" (UniqueName: \"kubernetes.io/projected/71619209-8442-4ad2-827f-39148126421e-kube-api-access-mv97l\") pod \"redhat-marketplace-hbtsz\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.972397 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-utilities\") pod \"redhat-marketplace-hbtsz\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 
15:26:10.972509 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-catalog-content\") pod \"redhat-marketplace-hbtsz\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.974922 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbtsz"] Dec 01 15:26:10 crc kubenswrapper[4931]: I1201 15:26:10.981593 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000355db-6a0a-46ab-8e8e-1040c775de8a-kube-api-access-fp2lh" (OuterVolumeSpecName: "kube-api-access-fp2lh") pod "000355db-6a0a-46ab-8e8e-1040c775de8a" (UID: "000355db-6a0a-46ab-8e8e-1040c775de8a"). InnerVolumeSpecName "kube-api-access-fp2lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.008544 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-inventory" (OuterVolumeSpecName: "inventory") pod "000355db-6a0a-46ab-8e8e-1040c775de8a" (UID: "000355db-6a0a-46ab-8e8e-1040c775de8a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.018475 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "000355db-6a0a-46ab-8e8e-1040c775de8a" (UID: "000355db-6a0a-46ab-8e8e-1040c775de8a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.074041 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-utilities\") pod \"redhat-marketplace-hbtsz\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.074248 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-catalog-content\") pod \"redhat-marketplace-hbtsz\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.074375 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv97l\" (UniqueName: \"kubernetes.io/projected/71619209-8442-4ad2-827f-39148126421e-kube-api-access-mv97l\") pod \"redhat-marketplace-hbtsz\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.074563 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.074616 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp2lh\" (UniqueName: \"kubernetes.io/projected/000355db-6a0a-46ab-8e8e-1040c775de8a-kube-api-access-fp2lh\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.074668 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/000355db-6a0a-46ab-8e8e-1040c775de8a-inventory\") on 
node \"crc\" DevicePath \"\"" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.074630 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-utilities\") pod \"redhat-marketplace-hbtsz\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.074851 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-catalog-content\") pod \"redhat-marketplace-hbtsz\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.094881 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv97l\" (UniqueName: \"kubernetes.io/projected/71619209-8442-4ad2-827f-39148126421e-kube-api-access-mv97l\") pod \"redhat-marketplace-hbtsz\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.372251 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.527783 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcckx" event={"ID":"16fcf7cb-6d10-4298-9884-2ad7c0837fa6","Type":"ContainerStarted","Data":"9074c67ca72e31ea46819b0ab50be650b31a0aac19f201a087b244147fae1c7b"} Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.535584 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" event={"ID":"000355db-6a0a-46ab-8e8e-1040c775de8a","Type":"ContainerDied","Data":"69eaf75b489d9715782db8d8e212d97975953b5f7b66304a180a62b131b7d0cc"} Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.535820 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69eaf75b489d9715782db8d8e212d97975953b5f7b66304a180a62b131b7d0cc" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.535907 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wx9bv" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.566528 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bcckx" podStartSLOduration=3.648479762 podStartE2EDuration="6.566510132s" podCreationTimestamp="2025-12-01 15:26:05 +0000 UTC" firstStartedPulling="2025-12-01 15:26:07.481592237 +0000 UTC m=+1513.907465924" lastFinishedPulling="2025-12-01 15:26:10.399622627 +0000 UTC m=+1516.825496294" observedRunningTime="2025-12-01 15:26:11.560049328 +0000 UTC m=+1517.985923025" watchObservedRunningTime="2025-12-01 15:26:11.566510132 +0000 UTC m=+1517.992383799" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.597566 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn"] Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.598865 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.602879 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.603166 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.603313 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.603444 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.624312 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn"] Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.695736 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.695971 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.696232 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kghhm\" (UniqueName: \"kubernetes.io/projected/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-kube-api-access-kghhm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.696484 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.798084 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kghhm\" (UniqueName: \"kubernetes.io/projected/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-kube-api-access-kghhm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.798163 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.798231 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.798322 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.805955 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.808197 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.809982 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.821967 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-kghhm\" (UniqueName: \"kubernetes.io/projected/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-kube-api-access-kghhm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.908126 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbtsz"] Dec 01 15:26:11 crc kubenswrapper[4931]: I1201 15:26:11.935842 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:26:12 crc kubenswrapper[4931]: I1201 15:26:12.498983 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn"] Dec 01 15:26:12 crc kubenswrapper[4931]: W1201 15:26:12.505436 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5d464d5_ddf8_4b7f_b1fb_7d65c5edd6f4.slice/crio-09922afcd78dfb44920c2d0ddcb2ce966dc7a1693a72199792f1a0b7191a65ed WatchSource:0}: Error finding container 09922afcd78dfb44920c2d0ddcb2ce966dc7a1693a72199792f1a0b7191a65ed: Status 404 returned error can't find the container with id 09922afcd78dfb44920c2d0ddcb2ce966dc7a1693a72199792f1a0b7191a65ed Dec 01 15:26:12 crc kubenswrapper[4931]: I1201 15:26:12.553741 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" event={"ID":"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4","Type":"ContainerStarted","Data":"09922afcd78dfb44920c2d0ddcb2ce966dc7a1693a72199792f1a0b7191a65ed"} Dec 01 15:26:12 crc kubenswrapper[4931]: I1201 15:26:12.558655 4931 generic.go:334] "Generic (PLEG): container finished" podID="71619209-8442-4ad2-827f-39148126421e" 
containerID="efd0fbd715944878f42fa752e4703c60d7935541b9bf9ff5956bcf760b28fb98" exitCode=0 Dec 01 15:26:12 crc kubenswrapper[4931]: I1201 15:26:12.558778 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbtsz" event={"ID":"71619209-8442-4ad2-827f-39148126421e","Type":"ContainerDied","Data":"efd0fbd715944878f42fa752e4703c60d7935541b9bf9ff5956bcf760b28fb98"} Dec 01 15:26:12 crc kubenswrapper[4931]: I1201 15:26:12.558869 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbtsz" event={"ID":"71619209-8442-4ad2-827f-39148126421e","Type":"ContainerStarted","Data":"6ba9cd85c84fca7c53776a13f20993c9a6ecea418c20560df4a3fcf7b0b3d3e6"} Dec 01 15:26:13 crc kubenswrapper[4931]: I1201 15:26:13.571164 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" event={"ID":"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4","Type":"ContainerStarted","Data":"0ab57272aae2ec59fcc559e7e38ed52c4093fee8cd75a12358b89395210ab3e3"} Dec 01 15:26:13 crc kubenswrapper[4931]: I1201 15:26:13.598207 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" podStartSLOduration=2.022267969 podStartE2EDuration="2.598184703s" podCreationTimestamp="2025-12-01 15:26:11 +0000 UTC" firstStartedPulling="2025-12-01 15:26:12.508890766 +0000 UTC m=+1518.934764433" lastFinishedPulling="2025-12-01 15:26:13.0848075 +0000 UTC m=+1519.510681167" observedRunningTime="2025-12-01 15:26:13.588210009 +0000 UTC m=+1520.014083696" watchObservedRunningTime="2025-12-01 15:26:13.598184703 +0000 UTC m=+1520.024058370" Dec 01 15:26:14 crc kubenswrapper[4931]: I1201 15:26:14.582847 4931 generic.go:334] "Generic (PLEG): container finished" podID="71619209-8442-4ad2-827f-39148126421e" containerID="31592dab405e06f3a6d3afe140f492b10a0efc822f01faa36a8ffaf31da88ac4" exitCode=0 Dec 01 15:26:14 crc 
kubenswrapper[4931]: I1201 15:26:14.583053 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbtsz" event={"ID":"71619209-8442-4ad2-827f-39148126421e","Type":"ContainerDied","Data":"31592dab405e06f3a6d3afe140f492b10a0efc822f01faa36a8ffaf31da88ac4"} Dec 01 15:26:16 crc kubenswrapper[4931]: I1201 15:26:16.311203 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:16 crc kubenswrapper[4931]: I1201 15:26:16.311518 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:16 crc kubenswrapper[4931]: I1201 15:26:16.363171 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:16 crc kubenswrapper[4931]: I1201 15:26:16.659377 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:17 crc kubenswrapper[4931]: I1201 15:26:17.542183 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcckx"] Dec 01 15:26:18 crc kubenswrapper[4931]: I1201 15:26:18.619435 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bcckx" podUID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerName="registry-server" containerID="cri-o://9074c67ca72e31ea46819b0ab50be650b31a0aac19f201a087b244147fae1c7b" gracePeriod=2 Dec 01 15:26:19 crc kubenswrapper[4931]: I1201 15:26:19.631500 4931 generic.go:334] "Generic (PLEG): container finished" podID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerID="9074c67ca72e31ea46819b0ab50be650b31a0aac19f201a087b244147fae1c7b" exitCode=0 Dec 01 15:26:19 crc kubenswrapper[4931]: I1201 15:26:19.631556 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bcckx" event={"ID":"16fcf7cb-6d10-4298-9884-2ad7c0837fa6","Type":"ContainerDied","Data":"9074c67ca72e31ea46819b0ab50be650b31a0aac19f201a087b244147fae1c7b"} Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.526814 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.610741 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-utilities\") pod \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.610841 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-catalog-content\") pod \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.610944 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfvm2\" (UniqueName: \"kubernetes.io/projected/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-kube-api-access-wfvm2\") pod \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\" (UID: \"16fcf7cb-6d10-4298-9884-2ad7c0837fa6\") " Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.612049 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-utilities" (OuterVolumeSpecName: "utilities") pod "16fcf7cb-6d10-4298-9884-2ad7c0837fa6" (UID: "16fcf7cb-6d10-4298-9884-2ad7c0837fa6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.620019 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-kube-api-access-wfvm2" (OuterVolumeSpecName: "kube-api-access-wfvm2") pod "16fcf7cb-6d10-4298-9884-2ad7c0837fa6" (UID: "16fcf7cb-6d10-4298-9884-2ad7c0837fa6"). InnerVolumeSpecName "kube-api-access-wfvm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.649051 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcckx" event={"ID":"16fcf7cb-6d10-4298-9884-2ad7c0837fa6","Type":"ContainerDied","Data":"2ed5434a5e7fea0e890ee5cea1f391ca72debb0863e54f9badc5ba5cd60d0514"} Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.649108 4931 scope.go:117] "RemoveContainer" containerID="9074c67ca72e31ea46819b0ab50be650b31a0aac19f201a087b244147fae1c7b" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.649116 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcckx" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.670965 4931 scope.go:117] "RemoveContainer" containerID="efdff8afa9a70c57f16de9255e2071f5e0cf666867a6277ce31fbc1b2d53db1f" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.685281 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16fcf7cb-6d10-4298-9884-2ad7c0837fa6" (UID: "16fcf7cb-6d10-4298-9884-2ad7c0837fa6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.693191 4931 scope.go:117] "RemoveContainer" containerID="09a556f7f88d101033884e665fd170496a79993d0d67a85106fa97301b63020b" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.712576 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfvm2\" (UniqueName: \"kubernetes.io/projected/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-kube-api-access-wfvm2\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.712615 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.712629 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16fcf7cb-6d10-4298-9884-2ad7c0837fa6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.989477 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcckx"] Dec 01 15:26:20 crc kubenswrapper[4931]: I1201 15:26:20.999729 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bcckx"] Dec 01 15:26:21 crc kubenswrapper[4931]: E1201 15:26:21.141053 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fcf7cb_6d10_4298_9884_2ad7c0837fa6.slice\": RecentStats: unable to find data in memory cache]" Dec 01 15:26:21 crc kubenswrapper[4931]: I1201 15:26:21.660670 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbtsz" 
event={"ID":"71619209-8442-4ad2-827f-39148126421e","Type":"ContainerStarted","Data":"dfd83b7987253e4681798032ce7334d3561da9fe416603ef3705f8e2992b8127"} Dec 01 15:26:21 crc kubenswrapper[4931]: I1201 15:26:21.682345 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hbtsz" podStartSLOduration=3.863753622 podStartE2EDuration="11.682329974s" podCreationTimestamp="2025-12-01 15:26:10 +0000 UTC" firstStartedPulling="2025-12-01 15:26:12.563747618 +0000 UTC m=+1518.989621285" lastFinishedPulling="2025-12-01 15:26:20.38232395 +0000 UTC m=+1526.808197637" observedRunningTime="2025-12-01 15:26:21.679084872 +0000 UTC m=+1528.104958539" watchObservedRunningTime="2025-12-01 15:26:21.682329974 +0000 UTC m=+1528.108203641" Dec 01 15:26:22 crc kubenswrapper[4931]: I1201 15:26:22.251883 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" path="/var/lib/kubelet/pods/16fcf7cb-6d10-4298-9884-2ad7c0837fa6/volumes" Dec 01 15:26:31 crc kubenswrapper[4931]: I1201 15:26:31.372820 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:31 crc kubenswrapper[4931]: I1201 15:26:31.374084 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:31 crc kubenswrapper[4931]: I1201 15:26:31.433718 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:31 crc kubenswrapper[4931]: I1201 15:26:31.820168 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:31 crc kubenswrapper[4931]: I1201 15:26:31.872887 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbtsz"] Dec 01 15:26:33 crc 
kubenswrapper[4931]: I1201 15:26:33.791085 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hbtsz" podUID="71619209-8442-4ad2-827f-39148126421e" containerName="registry-server" containerID="cri-o://dfd83b7987253e4681798032ce7334d3561da9fe416603ef3705f8e2992b8127" gracePeriod=2 Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.805422 4931 generic.go:334] "Generic (PLEG): container finished" podID="71619209-8442-4ad2-827f-39148126421e" containerID="dfd83b7987253e4681798032ce7334d3561da9fe416603ef3705f8e2992b8127" exitCode=0 Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.805530 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbtsz" event={"ID":"71619209-8442-4ad2-827f-39148126421e","Type":"ContainerDied","Data":"dfd83b7987253e4681798032ce7334d3561da9fe416603ef3705f8e2992b8127"} Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.805806 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbtsz" event={"ID":"71619209-8442-4ad2-827f-39148126421e","Type":"ContainerDied","Data":"6ba9cd85c84fca7c53776a13f20993c9a6ecea418c20560df4a3fcf7b0b3d3e6"} Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.805823 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba9cd85c84fca7c53776a13f20993c9a6ecea418c20560df4a3fcf7b0b3d3e6" Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.879246 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.969270 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-catalog-content\") pod \"71619209-8442-4ad2-827f-39148126421e\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.969375 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv97l\" (UniqueName: \"kubernetes.io/projected/71619209-8442-4ad2-827f-39148126421e-kube-api-access-mv97l\") pod \"71619209-8442-4ad2-827f-39148126421e\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.969739 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-utilities\") pod \"71619209-8442-4ad2-827f-39148126421e\" (UID: \"71619209-8442-4ad2-827f-39148126421e\") " Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.970769 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-utilities" (OuterVolumeSpecName: "utilities") pod "71619209-8442-4ad2-827f-39148126421e" (UID: "71619209-8442-4ad2-827f-39148126421e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.975323 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71619209-8442-4ad2-827f-39148126421e-kube-api-access-mv97l" (OuterVolumeSpecName: "kube-api-access-mv97l") pod "71619209-8442-4ad2-827f-39148126421e" (UID: "71619209-8442-4ad2-827f-39148126421e"). InnerVolumeSpecName "kube-api-access-mv97l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:26:34 crc kubenswrapper[4931]: I1201 15:26:34.995849 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71619209-8442-4ad2-827f-39148126421e" (UID: "71619209-8442-4ad2-827f-39148126421e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:26:35 crc kubenswrapper[4931]: I1201 15:26:35.074176 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:35 crc kubenswrapper[4931]: I1201 15:26:35.074214 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv97l\" (UniqueName: \"kubernetes.io/projected/71619209-8442-4ad2-827f-39148126421e-kube-api-access-mv97l\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:35 crc kubenswrapper[4931]: I1201 15:26:35.074314 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71619209-8442-4ad2-827f-39148126421e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:26:35 crc kubenswrapper[4931]: I1201 15:26:35.793321 4931 scope.go:117] "RemoveContainer" containerID="15f9cb36f6d653ce9e721db1bcf38fc6a4039d817385c223e9d7107ae75e02f9" Dec 01 15:26:35 crc kubenswrapper[4931]: I1201 15:26:35.818797 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbtsz" Dec 01 15:26:35 crc kubenswrapper[4931]: I1201 15:26:35.824050 4931 scope.go:117] "RemoveContainer" containerID="f661bb563ead0bf260e8494ec73c509e32bae2f6ca65512ec8607f11dc2c0d22" Dec 01 15:26:35 crc kubenswrapper[4931]: I1201 15:26:35.864706 4931 scope.go:117] "RemoveContainer" containerID="6418cf119c9faa1901b2d0d27709caca64b33694788e403f87f151282873395c" Dec 01 15:26:35 crc kubenswrapper[4931]: I1201 15:26:35.926344 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbtsz"] Dec 01 15:26:35 crc kubenswrapper[4931]: I1201 15:26:35.951784 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbtsz"] Dec 01 15:26:36 crc kubenswrapper[4931]: I1201 15:26:36.269669 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71619209-8442-4ad2-827f-39148126421e" path="/var/lib/kubelet/pods/71619209-8442-4ad2-827f-39148126421e/volumes" Dec 01 15:27:35 crc kubenswrapper[4931]: I1201 15:27:35.981092 4931 scope.go:117] "RemoveContainer" containerID="f83a965d4d57c0812d20fe8b2533ce81c2d4b03b6b3136f4de5ddfd02deeaa1d" Dec 01 15:27:49 crc kubenswrapper[4931]: I1201 15:27:49.872584 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:27:49 crc kubenswrapper[4931]: I1201 15:27:49.873309 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:28:19 crc kubenswrapper[4931]: 
I1201 15:28:19.872487 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:28:19 crc kubenswrapper[4931]: I1201 15:28:19.872973 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:28:36 crc kubenswrapper[4931]: I1201 15:28:36.097029 4931 scope.go:117] "RemoveContainer" containerID="d3f93fc64e9553ec77673799f6285c6fbd88af0c65638bb2fb10982cee29ea3b" Dec 01 15:28:49 crc kubenswrapper[4931]: I1201 15:28:49.872202 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:28:49 crc kubenswrapper[4931]: I1201 15:28:49.872896 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:28:49 crc kubenswrapper[4931]: I1201 15:28:49.872967 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:28:49 crc kubenswrapper[4931]: I1201 15:28:49.874049 4931 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:28:49 crc kubenswrapper[4931]: I1201 15:28:49.874156 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" gracePeriod=600 Dec 01 15:28:50 crc kubenswrapper[4931]: E1201 15:28:50.000096 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:28:50 crc kubenswrapper[4931]: I1201 15:28:50.121414 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" exitCode=0 Dec 01 15:28:50 crc kubenswrapper[4931]: I1201 15:28:50.121455 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9"} Dec 01 15:28:50 crc kubenswrapper[4931]: I1201 15:28:50.121485 4931 scope.go:117] "RemoveContainer" containerID="58f06495cba8dbbb838e05fa0d374d0f9cc22d5fa8c965e16e9109a8373c2319" Dec 01 15:28:50 crc 
kubenswrapper[4931]: I1201 15:28:50.122064 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:28:50 crc kubenswrapper[4931]: E1201 15:28:50.122442 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:28:54 crc kubenswrapper[4931]: E1201 15:28:54.955712 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Dec 01 15:29:03 crc kubenswrapper[4931]: I1201 15:29:03.241949 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:29:03 crc kubenswrapper[4931]: E1201 15:29:03.243895 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:29:08 crc kubenswrapper[4931]: I1201 15:29:08.045915 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-788c-account-create-update-fg8q4"] Dec 01 15:29:08 crc kubenswrapper[4931]: I1201 15:29:08.056440 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-65sr4"] Dec 01 15:29:08 crc kubenswrapper[4931]: I1201 15:29:08.065177 4931 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5gpcp"] Dec 01 15:29:08 crc kubenswrapper[4931]: I1201 15:29:08.073449 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-788c-account-create-update-fg8q4"] Dec 01 15:29:08 crc kubenswrapper[4931]: I1201 15:29:08.080892 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5gpcp"] Dec 01 15:29:08 crc kubenswrapper[4931]: I1201 15:29:08.089004 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-65sr4"] Dec 01 15:29:08 crc kubenswrapper[4931]: I1201 15:29:08.255235 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c" path="/var/lib/kubelet/pods/27e0e4e7-9ebf-4fa2-89a1-455c8260bf3c/volumes" Dec 01 15:29:08 crc kubenswrapper[4931]: I1201 15:29:08.256124 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792a6a53-2fbf-4393-8464-f18ead2b290d" path="/var/lib/kubelet/pods/792a6a53-2fbf-4393-8464-f18ead2b290d/volumes" Dec 01 15:29:08 crc kubenswrapper[4931]: I1201 15:29:08.256679 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce0b7ec-9061-4965-8139-ebf2da3036be" path="/var/lib/kubelet/pods/bce0b7ec-9061-4965-8139-ebf2da3036be/volumes" Dec 01 15:29:09 crc kubenswrapper[4931]: I1201 15:29:09.028864 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-86ad-account-create-update-cjxmb"] Dec 01 15:29:09 crc kubenswrapper[4931]: I1201 15:29:09.038964 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-70d9-account-create-update-c6rqw"] Dec 01 15:29:09 crc kubenswrapper[4931]: I1201 15:29:09.049586 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-sr224"] Dec 01 15:29:09 crc kubenswrapper[4931]: I1201 15:29:09.057628 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-86ad-account-create-update-cjxmb"] Dec 01 15:29:09 crc kubenswrapper[4931]: I1201 15:29:09.067011 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-70d9-account-create-update-c6rqw"] Dec 01 15:29:09 crc kubenswrapper[4931]: I1201 15:29:09.076138 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-sr224"] Dec 01 15:29:10 crc kubenswrapper[4931]: I1201 15:29:10.252121 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9447449e-ba97-4ec6-b8c6-bdc2b54aa746" path="/var/lib/kubelet/pods/9447449e-ba97-4ec6-b8c6-bdc2b54aa746/volumes" Dec 01 15:29:10 crc kubenswrapper[4931]: I1201 15:29:10.252932 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a66dba-9924-4089-a3e5-5aa771f117b5" path="/var/lib/kubelet/pods/b7a66dba-9924-4089-a3e5-5aa771f117b5/volumes" Dec 01 15:29:10 crc kubenswrapper[4931]: I1201 15:29:10.253451 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8725b44-71db-414e-a092-a684774ccc44" path="/var/lib/kubelet/pods/c8725b44-71db-414e-a092-a684774ccc44/volumes" Dec 01 15:29:16 crc kubenswrapper[4931]: I1201 15:29:16.241822 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:29:16 crc kubenswrapper[4931]: E1201 15:29:16.242590 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:29:19 crc kubenswrapper[4931]: I1201 15:29:19.394263 4931 generic.go:334] "Generic (PLEG): container finished" podID="b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4" 
containerID="0ab57272aae2ec59fcc559e7e38ed52c4093fee8cd75a12358b89395210ab3e3" exitCode=0 Dec 01 15:29:19 crc kubenswrapper[4931]: I1201 15:29:19.394372 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" event={"ID":"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4","Type":"ContainerDied","Data":"0ab57272aae2ec59fcc559e7e38ed52c4093fee8cd75a12358b89395210ab3e3"} Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.814317 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.892836 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-ssh-key\") pod \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.892881 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-bootstrap-combined-ca-bundle\") pod \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.892980 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-inventory\") pod \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.893034 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kghhm\" (UniqueName: \"kubernetes.io/projected/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-kube-api-access-kghhm\") pod 
\"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\" (UID: \"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4\") " Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.903495 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4" (UID: "b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.904511 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-kube-api-access-kghhm" (OuterVolumeSpecName: "kube-api-access-kghhm") pod "b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4" (UID: "b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4"). InnerVolumeSpecName "kube-api-access-kghhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.925196 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-inventory" (OuterVolumeSpecName: "inventory") pod "b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4" (UID: "b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.927554 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4" (UID: "b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.996079 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.996462 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kghhm\" (UniqueName: \"kubernetes.io/projected/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-kube-api-access-kghhm\") on node \"crc\" DevicePath \"\"" Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.997245 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:29:20 crc kubenswrapper[4931]: I1201 15:29:20.997316 4931 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.412182 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" event={"ID":"b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4","Type":"ContainerDied","Data":"09922afcd78dfb44920c2d0ddcb2ce966dc7a1693a72199792f1a0b7191a65ed"} Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.412495 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09922afcd78dfb44920c2d0ddcb2ce966dc7a1693a72199792f1a0b7191a65ed" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.412246 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.495484 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d"] Dec 01 15:29:21 crc kubenswrapper[4931]: E1201 15:29:21.495850 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71619209-8442-4ad2-827f-39148126421e" containerName="extract-utilities" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.495867 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="71619209-8442-4ad2-827f-39148126421e" containerName="extract-utilities" Dec 01 15:29:21 crc kubenswrapper[4931]: E1201 15:29:21.495881 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.495891 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 15:29:21 crc kubenswrapper[4931]: E1201 15:29:21.495929 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerName="registry-server" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.495938 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerName="registry-server" Dec 01 15:29:21 crc kubenswrapper[4931]: E1201 15:29:21.495947 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerName="extract-content" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.495955 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerName="extract-content" Dec 01 15:29:21 crc kubenswrapper[4931]: E1201 15:29:21.495970 
4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71619209-8442-4ad2-827f-39148126421e" containerName="extract-content" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.495978 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="71619209-8442-4ad2-827f-39148126421e" containerName="extract-content" Dec 01 15:29:21 crc kubenswrapper[4931]: E1201 15:29:21.495999 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71619209-8442-4ad2-827f-39148126421e" containerName="registry-server" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.496005 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="71619209-8442-4ad2-827f-39148126421e" containerName="registry-server" Dec 01 15:29:21 crc kubenswrapper[4931]: E1201 15:29:21.496018 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerName="extract-utilities" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.496024 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerName="extract-utilities" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.496202 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fcf7cb-6d10-4298-9884-2ad7c0837fa6" containerName="registry-server" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.496221 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="71619209-8442-4ad2-827f-39148126421e" containerName="registry-server" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.496232 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.496833 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.500925 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.500975 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.501007 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.501423 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.508889 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d"] Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.607357 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.607443 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq7qr\" (UniqueName: \"kubernetes.io/projected/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-kube-api-access-nq7qr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.607548 
4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.708751 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.708857 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.708923 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq7qr\" (UniqueName: \"kubernetes.io/projected/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-kube-api-access-nq7qr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.712479 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.713003 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.725930 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq7qr\" (UniqueName: \"kubernetes.io/projected/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-kube-api-access-nq7qr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:21 crc kubenswrapper[4931]: I1201 15:29:21.814218 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:29:22 crc kubenswrapper[4931]: I1201 15:29:22.339107 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d"] Dec 01 15:29:22 crc kubenswrapper[4931]: I1201 15:29:22.343664 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 15:29:22 crc kubenswrapper[4931]: I1201 15:29:22.424218 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" event={"ID":"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d","Type":"ContainerStarted","Data":"d17d9d9b59df7ab4db43156d6c751b8af72006192c1e198b4a40642962b488f1"} Dec 01 15:29:23 crc kubenswrapper[4931]: I1201 15:29:23.437022 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" event={"ID":"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d","Type":"ContainerStarted","Data":"579348564d31bc5d4997a4ec08b9f0c552c66327f56c47177e4ee5ff09eb5431"} Dec 01 15:29:23 crc kubenswrapper[4931]: I1201 15:29:23.471319 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" podStartSLOduration=1.632112827 podStartE2EDuration="2.471292334s" podCreationTimestamp="2025-12-01 15:29:21 +0000 UTC" firstStartedPulling="2025-12-01 15:29:22.343461621 +0000 UTC m=+1708.769335288" lastFinishedPulling="2025-12-01 15:29:23.182641128 +0000 UTC m=+1709.608514795" observedRunningTime="2025-12-01 15:29:23.454665521 +0000 UTC m=+1709.880539208" watchObservedRunningTime="2025-12-01 15:29:23.471292334 +0000 UTC m=+1709.897166001" Dec 01 15:29:27 crc kubenswrapper[4931]: I1201 15:29:27.241993 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:29:27 crc 
kubenswrapper[4931]: E1201 15:29:27.242896 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:29:35 crc kubenswrapper[4931]: I1201 15:29:35.064792 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-r5gs2"] Dec 01 15:29:35 crc kubenswrapper[4931]: I1201 15:29:35.078327 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-r5gs2"] Dec 01 15:29:36 crc kubenswrapper[4931]: I1201 15:29:36.183979 4931 scope.go:117] "RemoveContainer" containerID="fdd7a4c7a92ba9cb758e88893b2f1c5f62dba4a3c7e9298106be6262e54a308e" Dec 01 15:29:36 crc kubenswrapper[4931]: I1201 15:29:36.209238 4931 scope.go:117] "RemoveContainer" containerID="c86bb0a9442da898b51e6e080f2f1403ab11dacd0195a5eeca7b2571ce0d5115" Dec 01 15:29:36 crc kubenswrapper[4931]: I1201 15:29:36.256316 4931 scope.go:117] "RemoveContainer" containerID="d1fa95d1ef0a54347f08ae3286bdbe0c8a631f94b0bd1c090c96edce11e8a2ce" Dec 01 15:29:36 crc kubenswrapper[4931]: I1201 15:29:36.270530 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="188bed20-c6ed-4ead-a520-03ec97974362" path="/var/lib/kubelet/pods/188bed20-c6ed-4ead-a520-03ec97974362/volumes" Dec 01 15:29:36 crc kubenswrapper[4931]: I1201 15:29:36.303312 4931 scope.go:117] "RemoveContainer" containerID="a863911a6a782d716dfc46306a44f46f24cdc14d169cac5410a982b5443dc918" Dec 01 15:29:36 crc kubenswrapper[4931]: I1201 15:29:36.350804 4931 scope.go:117] "RemoveContainer" containerID="4b206d60ad533adca31c307ccb4d0539ee362c2423e697e0fc52abbdabe2aa20" Dec 01 15:29:36 crc kubenswrapper[4931]: I1201 15:29:36.394360 
4931 scope.go:117] "RemoveContainer" containerID="f0325033cc8c49f4325bd511aff1e26e25ca14fb3bd79e8105d0bb277ffbc81e" Dec 01 15:29:36 crc kubenswrapper[4931]: I1201 15:29:36.439007 4931 scope.go:117] "RemoveContainer" containerID="3d23c379a6890da2744f5195f6128f44d756b055e73126e3e30e8dd0f1745fdb" Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.063512 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3065-account-create-update-vfvmz"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.080340 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xnxrs"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.096807 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-bb76-account-create-update-2jg2w"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.104956 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0a7a-account-create-update-9qs6p"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.113653 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8zdh2"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.123009 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8zdh2"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.131716 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0a7a-account-create-update-9qs6p"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.141428 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-bb76-account-create-update-2jg2w"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.148265 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3065-account-create-update-vfvmz"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.158302 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-czvkh"] Dec 01 
15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.167790 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-czvkh"] Dec 01 15:29:39 crc kubenswrapper[4931]: I1201 15:29:39.175445 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xnxrs"] Dec 01 15:29:40 crc kubenswrapper[4931]: I1201 15:29:40.259222 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b88a61-4bfa-4704-b3ae-1f61c9b65488" path="/var/lib/kubelet/pods/23b88a61-4bfa-4704-b3ae-1f61c9b65488/volumes" Dec 01 15:29:40 crc kubenswrapper[4931]: I1201 15:29:40.259852 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a77930-f114-45b8-9256-94d34ad92839" path="/var/lib/kubelet/pods/53a77930-f114-45b8-9256-94d34ad92839/volumes" Dec 01 15:29:40 crc kubenswrapper[4931]: I1201 15:29:40.260475 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c273f1c9-0358-4bcc-a98d-2c3ab0cd677b" path="/var/lib/kubelet/pods/c273f1c9-0358-4bcc-a98d-2c3ab0cd677b/volumes" Dec 01 15:29:40 crc kubenswrapper[4931]: I1201 15:29:40.261162 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f50443-d0a3-4976-943e-0ac0de49b9b9" path="/var/lib/kubelet/pods/d8f50443-d0a3-4976-943e-0ac0de49b9b9/volumes" Dec 01 15:29:40 crc kubenswrapper[4931]: I1201 15:29:40.262168 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f23b7a9b-3694-415b-8e68-d0757a5ccc7a" path="/var/lib/kubelet/pods/f23b7a9b-3694-415b-8e68-d0757a5ccc7a/volumes" Dec 01 15:29:40 crc kubenswrapper[4931]: I1201 15:29:40.262917 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2a8742-bece-4c6a-9f4d-6c7d1dc97450" path="/var/lib/kubelet/pods/ff2a8742-bece-4c6a-9f4d-6c7d1dc97450/volumes" Dec 01 15:29:42 crc kubenswrapper[4931]: I1201 15:29:42.243479 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 
15:29:42 crc kubenswrapper[4931]: E1201 15:29:42.244191 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:29:46 crc kubenswrapper[4931]: I1201 15:29:46.043819 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dnbwr"] Dec 01 15:29:46 crc kubenswrapper[4931]: I1201 15:29:46.059557 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dnbwr"] Dec 01 15:29:46 crc kubenswrapper[4931]: I1201 15:29:46.255196 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeec95d4-0a22-40e6-b1ca-b15703d71b47" path="/var/lib/kubelet/pods/aeec95d4-0a22-40e6-b1ca-b15703d71b47/volumes" Dec 01 15:29:53 crc kubenswrapper[4931]: I1201 15:29:53.241701 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:29:53 crc kubenswrapper[4931]: E1201 15:29:53.242328 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.140281 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz"] Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.142307 4931 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.145420 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.145532 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.170011 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz"] Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.247340 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/268742df-a602-429b-887a-f25239e66bfb-config-volume\") pod \"collect-profiles-29410050-p9hrz\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.247451 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/268742df-a602-429b-887a-f25239e66bfb-secret-volume\") pod \"collect-profiles-29410050-p9hrz\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.247636 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47xq6\" (UniqueName: \"kubernetes.io/projected/268742df-a602-429b-887a-f25239e66bfb-kube-api-access-47xq6\") pod \"collect-profiles-29410050-p9hrz\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.348907 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/268742df-a602-429b-887a-f25239e66bfb-secret-volume\") pod \"collect-profiles-29410050-p9hrz\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.349028 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47xq6\" (UniqueName: \"kubernetes.io/projected/268742df-a602-429b-887a-f25239e66bfb-kube-api-access-47xq6\") pod \"collect-profiles-29410050-p9hrz\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.349140 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/268742df-a602-429b-887a-f25239e66bfb-config-volume\") pod \"collect-profiles-29410050-p9hrz\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.350138 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/268742df-a602-429b-887a-f25239e66bfb-config-volume\") pod \"collect-profiles-29410050-p9hrz\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.355146 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/268742df-a602-429b-887a-f25239e66bfb-secret-volume\") pod \"collect-profiles-29410050-p9hrz\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.379063 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47xq6\" (UniqueName: \"kubernetes.io/projected/268742df-a602-429b-887a-f25239e66bfb-kube-api-access-47xq6\") pod \"collect-profiles-29410050-p9hrz\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.468083 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:00 crc kubenswrapper[4931]: I1201 15:30:00.898848 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz"] Dec 01 15:30:01 crc kubenswrapper[4931]: I1201 15:30:01.798916 4931 generic.go:334] "Generic (PLEG): container finished" podID="268742df-a602-429b-887a-f25239e66bfb" containerID="cac391e86c5b6293e162f64a542ad08514661733c1f36c02475665e284023d97" exitCode=0 Dec 01 15:30:01 crc kubenswrapper[4931]: I1201 15:30:01.798975 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" event={"ID":"268742df-a602-429b-887a-f25239e66bfb","Type":"ContainerDied","Data":"cac391e86c5b6293e162f64a542ad08514661733c1f36c02475665e284023d97"} Dec 01 15:30:01 crc kubenswrapper[4931]: I1201 15:30:01.799363 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" 
event={"ID":"268742df-a602-429b-887a-f25239e66bfb","Type":"ContainerStarted","Data":"47fb55244b72f2dc31f83af14700a3ea9cad64a670c59e5070f5c668df35079a"} Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.139678 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.195452 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/268742df-a602-429b-887a-f25239e66bfb-config-volume\") pod \"268742df-a602-429b-887a-f25239e66bfb\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.195592 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/268742df-a602-429b-887a-f25239e66bfb-secret-volume\") pod \"268742df-a602-429b-887a-f25239e66bfb\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.195822 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47xq6\" (UniqueName: \"kubernetes.io/projected/268742df-a602-429b-887a-f25239e66bfb-kube-api-access-47xq6\") pod \"268742df-a602-429b-887a-f25239e66bfb\" (UID: \"268742df-a602-429b-887a-f25239e66bfb\") " Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.196292 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/268742df-a602-429b-887a-f25239e66bfb-config-volume" (OuterVolumeSpecName: "config-volume") pod "268742df-a602-429b-887a-f25239e66bfb" (UID: "268742df-a602-429b-887a-f25239e66bfb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.196629 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/268742df-a602-429b-887a-f25239e66bfb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.203037 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268742df-a602-429b-887a-f25239e66bfb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "268742df-a602-429b-887a-f25239e66bfb" (UID: "268742df-a602-429b-887a-f25239e66bfb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.206499 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268742df-a602-429b-887a-f25239e66bfb-kube-api-access-47xq6" (OuterVolumeSpecName: "kube-api-access-47xq6") pod "268742df-a602-429b-887a-f25239e66bfb" (UID: "268742df-a602-429b-887a-f25239e66bfb"). InnerVolumeSpecName "kube-api-access-47xq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.298963 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47xq6\" (UniqueName: \"kubernetes.io/projected/268742df-a602-429b-887a-f25239e66bfb-kube-api-access-47xq6\") on node \"crc\" DevicePath \"\"" Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.299004 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/268742df-a602-429b-887a-f25239e66bfb-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.822053 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" event={"ID":"268742df-a602-429b-887a-f25239e66bfb","Type":"ContainerDied","Data":"47fb55244b72f2dc31f83af14700a3ea9cad64a670c59e5070f5c668df35079a"} Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.822095 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz" Dec 01 15:30:03 crc kubenswrapper[4931]: I1201 15:30:03.822100 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47fb55244b72f2dc31f83af14700a3ea9cad64a670c59e5070f5c668df35079a" Dec 01 15:30:05 crc kubenswrapper[4931]: I1201 15:30:05.241296 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:30:05 crc kubenswrapper[4931]: E1201 15:30:05.241817 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:30:18 crc kubenswrapper[4931]: I1201 15:30:18.241363 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:30:18 crc kubenswrapper[4931]: E1201 15:30:18.241968 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:30:29 crc kubenswrapper[4931]: I1201 15:30:29.241761 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:30:29 crc kubenswrapper[4931]: E1201 15:30:29.243453 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:30:36 crc kubenswrapper[4931]: I1201 15:30:36.588557 4931 scope.go:117] "RemoveContainer" containerID="9cc39002084d2cf94ab0f3ca359a9afd9479c8887132254516d171653c256b6a" Dec 01 15:30:36 crc kubenswrapper[4931]: I1201 15:30:36.632123 4931 scope.go:117] "RemoveContainer" containerID="36e2ad7b2c9c6bb49ffeb3b09196062a70b6c92c956052032a50f8e33d3593a4" Dec 01 15:30:36 crc kubenswrapper[4931]: I1201 15:30:36.661180 4931 scope.go:117] "RemoveContainer" containerID="474562a1b39cf4e490f292ca2d98e8481d1c879f2289066e262720282c3de466" Dec 01 15:30:36 crc kubenswrapper[4931]: I1201 15:30:36.724981 4931 scope.go:117] "RemoveContainer" containerID="7aa32bfcb52781fe815cf9f97649051984403a6c884e00198060ca109fee8b7b" Dec 01 15:30:36 crc kubenswrapper[4931]: I1201 15:30:36.761100 4931 scope.go:117] "RemoveContainer" containerID="e0c0899b3dd8a7aeb7ca184ab3e643ef350bb7efa3a2bdd04d57cf96832c9f2f" Dec 01 15:30:36 crc kubenswrapper[4931]: I1201 15:30:36.806163 4931 scope.go:117] "RemoveContainer" containerID="393e97e8f435b50a3f01a9d0061e12c6f41976aefe289a2909ec415ee54a8664" Dec 01 15:30:36 crc kubenswrapper[4931]: I1201 15:30:36.843912 4931 scope.go:117] "RemoveContainer" containerID="3d42c6bf96381e4cee89b6dc4c52cac055b381c0c484c3f3bcea642acc9ae693" Dec 01 15:30:42 crc kubenswrapper[4931]: I1201 15:30:42.242116 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:30:42 crc kubenswrapper[4931]: E1201 15:30:42.242946 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:30:47 crc kubenswrapper[4931]: I1201 15:30:47.037465 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x5lvn"] Dec 01 15:30:47 crc kubenswrapper[4931]: I1201 15:30:47.045305 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x5lvn"] Dec 01 15:30:48 crc kubenswrapper[4931]: I1201 15:30:48.253620 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a78a28-a980-4393-bd41-e0f523fccf7e" path="/var/lib/kubelet/pods/04a78a28-a980-4393-bd41-e0f523fccf7e/volumes" Dec 01 15:30:49 crc kubenswrapper[4931]: I1201 15:30:49.037079 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ffphc"] Dec 01 15:30:49 crc kubenswrapper[4931]: I1201 15:30:49.054129 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ffphc"] Dec 01 15:30:50 crc kubenswrapper[4931]: I1201 15:30:50.256771 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f515fe-6925-463c-b5dc-87b23d360ec5" path="/var/lib/kubelet/pods/38f515fe-6925-463c-b5dc-87b23d360ec5/volumes" Dec 01 15:30:54 crc kubenswrapper[4931]: I1201 15:30:54.251563 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:30:54 crc kubenswrapper[4931]: E1201 15:30:54.253164 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:30:56 crc kubenswrapper[4931]: I1201 15:30:56.030763 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9qzq8"] Dec 01 15:30:56 crc kubenswrapper[4931]: I1201 15:30:56.038981 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-qsr4v"] Dec 01 15:30:56 crc kubenswrapper[4931]: I1201 15:30:56.065771 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9qzq8"] Dec 01 15:30:56 crc kubenswrapper[4931]: I1201 15:30:56.068410 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-qsr4v"] Dec 01 15:30:56 crc kubenswrapper[4931]: I1201 15:30:56.255376 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79a9139-dcea-4b3f-83dc-a1715f087ac5" path="/var/lib/kubelet/pods/a79a9139-dcea-4b3f-83dc-a1715f087ac5/volumes" Dec 01 15:30:56 crc kubenswrapper[4931]: I1201 15:30:56.256162 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ada3f9-8074-48c1-a190-d6535e26e14f" path="/var/lib/kubelet/pods/f9ada3f9-8074-48c1-a190-d6535e26e14f/volumes" Dec 01 15:30:57 crc kubenswrapper[4931]: I1201 15:30:57.042082 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-cmc5n"] Dec 01 15:30:57 crc kubenswrapper[4931]: I1201 15:30:57.049671 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-cmc5n"] Dec 01 15:30:58 crc kubenswrapper[4931]: I1201 15:30:58.260823 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658da6f1-ac70-4d83-83ca-f79e69f0979d" path="/var/lib/kubelet/pods/658da6f1-ac70-4d83-83ca-f79e69f0979d/volumes" Dec 01 15:30:59 crc kubenswrapper[4931]: I1201 15:30:59.426808 4931 generic.go:334] "Generic (PLEG): container finished" podID="3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d" 
containerID="579348564d31bc5d4997a4ec08b9f0c552c66327f56c47177e4ee5ff09eb5431" exitCode=0 Dec 01 15:30:59 crc kubenswrapper[4931]: I1201 15:30:59.427252 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" event={"ID":"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d","Type":"ContainerDied","Data":"579348564d31bc5d4997a4ec08b9f0c552c66327f56c47177e4ee5ff09eb5431"} Dec 01 15:31:00 crc kubenswrapper[4931]: I1201 15:31:00.841172 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.022082 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-inventory\") pod \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.022211 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-ssh-key\") pod \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.022236 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq7qr\" (UniqueName: \"kubernetes.io/projected/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-kube-api-access-nq7qr\") pod \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\" (UID: \"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d\") " Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.032260 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-kube-api-access-nq7qr" (OuterVolumeSpecName: "kube-api-access-nq7qr") pod 
"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d" (UID: "3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d"). InnerVolumeSpecName "kube-api-access-nq7qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.048523 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-inventory" (OuterVolumeSpecName: "inventory") pod "3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d" (UID: "3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.073558 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d" (UID: "3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.125459 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.125501 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.125514 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq7qr\" (UniqueName: \"kubernetes.io/projected/3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d-kube-api-access-nq7qr\") on node \"crc\" DevicePath \"\"" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.445309 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" event={"ID":"3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d","Type":"ContainerDied","Data":"d17d9d9b59df7ab4db43156d6c751b8af72006192c1e198b4a40642962b488f1"} Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.445575 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d17d9d9b59df7ab4db43156d6c751b8af72006192c1e198b4a40642962b488f1" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.445428 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.529164 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv"] Dec 01 15:31:01 crc kubenswrapper[4931]: E1201 15:31:01.529672 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268742df-a602-429b-887a-f25239e66bfb" containerName="collect-profiles" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.529692 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="268742df-a602-429b-887a-f25239e66bfb" containerName="collect-profiles" Dec 01 15:31:01 crc kubenswrapper[4931]: E1201 15:31:01.529730 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.529739 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.530019 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 15:31:01 crc kubenswrapper[4931]: 
I1201 15:31:01.530052 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="268742df-a602-429b-887a-f25239e66bfb" containerName="collect-profiles" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.531077 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.533079 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.533230 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.533984 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.534545 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.537603 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv"] Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.635943 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w677f\" (UniqueName: \"kubernetes.io/projected/ff92bfe2-2afc-4cc2-9317-db96b912117c-kube-api-access-w677f\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-65xbv\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.636003 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-65xbv\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.636274 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-65xbv\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.738636 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w677f\" (UniqueName: \"kubernetes.io/projected/ff92bfe2-2afc-4cc2-9317-db96b912117c-kube-api-access-w677f\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-65xbv\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.738712 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-65xbv\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.738799 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-65xbv\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.745091 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-65xbv\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.745899 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-65xbv\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.766426 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w677f\" (UniqueName: \"kubernetes.io/projected/ff92bfe2-2afc-4cc2-9317-db96b912117c-kube-api-access-w677f\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-65xbv\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:01 crc kubenswrapper[4931]: I1201 15:31:01.852853 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:31:02 crc kubenswrapper[4931]: I1201 15:31:02.404863 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv"] Dec 01 15:31:02 crc kubenswrapper[4931]: I1201 15:31:02.458286 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" event={"ID":"ff92bfe2-2afc-4cc2-9317-db96b912117c","Type":"ContainerStarted","Data":"01fbd94e9f4e095f9f704f3b0e5c1ab9e1ad61dc49539991255ea0516b9518de"} Dec 01 15:31:03 crc kubenswrapper[4931]: I1201 15:31:03.470624 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" event={"ID":"ff92bfe2-2afc-4cc2-9317-db96b912117c","Type":"ContainerStarted","Data":"b6644b93d6f0d586482e8e5732ca9742b680b94b1aa62466aaab27a3a429d650"} Dec 01 15:31:03 crc kubenswrapper[4931]: I1201 15:31:03.486648 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" podStartSLOduration=2.018655069 podStartE2EDuration="2.486632659s" podCreationTimestamp="2025-12-01 15:31:01 +0000 UTC" firstStartedPulling="2025-12-01 15:31:02.400173143 +0000 UTC m=+1808.826046810" lastFinishedPulling="2025-12-01 15:31:02.868150723 +0000 UTC m=+1809.294024400" observedRunningTime="2025-12-01 15:31:03.484452107 +0000 UTC m=+1809.910325794" watchObservedRunningTime="2025-12-01 15:31:03.486632659 +0000 UTC m=+1809.912506326" Dec 01 15:31:09 crc kubenswrapper[4931]: I1201 15:31:09.241091 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:31:09 crc kubenswrapper[4931]: E1201 15:31:09.241787 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:31:24 crc kubenswrapper[4931]: I1201 15:31:24.251310 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:31:24 crc kubenswrapper[4931]: E1201 15:31:24.252047 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:31:36 crc kubenswrapper[4931]: I1201 15:31:36.241167 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:31:36 crc kubenswrapper[4931]: E1201 15:31:36.241932 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:31:36 crc kubenswrapper[4931]: I1201 15:31:36.991573 4931 scope.go:117] "RemoveContainer" containerID="62e26de25f4e45ec1dd95fc4d57b3fd2474c0eb0b45e67d57e026a25c974903c" Dec 01 15:31:37 crc kubenswrapper[4931]: I1201 15:31:37.037810 4931 scope.go:117] "RemoveContainer" 
containerID="04b2d3391b567348655c3775a2e83ca33406614fb846da43e8e1f65cb26f4da3" Dec 01 15:31:37 crc kubenswrapper[4931]: I1201 15:31:37.077138 4931 scope.go:117] "RemoveContainer" containerID="19cc472e6fe2a229b8b7de1d3e4ad6fbc2d6a34025debd2ed8dd363218082cf4" Dec 01 15:31:37 crc kubenswrapper[4931]: I1201 15:31:37.112866 4931 scope.go:117] "RemoveContainer" containerID="bff0594af606cbd38e8a481f84acc561c9cabc0f5f3c71429ae70be32efce92d" Dec 01 15:31:37 crc kubenswrapper[4931]: I1201 15:31:37.159498 4931 scope.go:117] "RemoveContainer" containerID="34c8c9011e6fd28dfc61e7a80b19ed99873729e8f31c22ff568a0cdb447142e7" Dec 01 15:31:38 crc kubenswrapper[4931]: I1201 15:31:38.067804 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-8bv8v"] Dec 01 15:31:38 crc kubenswrapper[4931]: I1201 15:31:38.085761 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c460-account-create-update-bp94q"] Dec 01 15:31:38 crc kubenswrapper[4931]: I1201 15:31:38.095359 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c460-account-create-update-bp94q"] Dec 01 15:31:38 crc kubenswrapper[4931]: I1201 15:31:38.105180 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-8bv8v"] Dec 01 15:31:38 crc kubenswrapper[4931]: I1201 15:31:38.251435 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51cb7657-b8e4-43c0-b97a-633ae04b743d" path="/var/lib/kubelet/pods/51cb7657-b8e4-43c0-b97a-633ae04b743d/volumes" Dec 01 15:31:38 crc kubenswrapper[4931]: I1201 15:31:38.252347 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b99e4c2-d0b8-4206-a0cf-45f43d0557cf" path="/var/lib/kubelet/pods/8b99e4c2-d0b8-4206-a0cf-45f43d0557cf/volumes" Dec 01 15:31:39 crc kubenswrapper[4931]: I1201 15:31:39.044236 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6966-account-create-update-9b6ww"] Dec 01 15:31:39 crc 
kubenswrapper[4931]: I1201 15:31:39.057522 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6966-account-create-update-9b6ww"] Dec 01 15:31:39 crc kubenswrapper[4931]: I1201 15:31:39.073333 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-pt2d4"] Dec 01 15:31:39 crc kubenswrapper[4931]: I1201 15:31:39.091309 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-pt2d4"] Dec 01 15:31:40 crc kubenswrapper[4931]: I1201 15:31:40.032345 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4s7mh"] Dec 01 15:31:40 crc kubenswrapper[4931]: I1201 15:31:40.042502 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f730-account-create-update-g9cxx"] Dec 01 15:31:40 crc kubenswrapper[4931]: I1201 15:31:40.050350 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4s7mh"] Dec 01 15:31:40 crc kubenswrapper[4931]: I1201 15:31:40.056963 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f730-account-create-update-g9cxx"] Dec 01 15:31:40 crc kubenswrapper[4931]: I1201 15:31:40.251717 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37513865-ccd3-4ad7-89f0-66e1f3f6b9a4" path="/var/lib/kubelet/pods/37513865-ccd3-4ad7-89f0-66e1f3f6b9a4/volumes" Dec 01 15:31:40 crc kubenswrapper[4931]: I1201 15:31:40.252322 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55f40fe4-164c-4aad-9644-d509c905673f" path="/var/lib/kubelet/pods/55f40fe4-164c-4aad-9644-d509c905673f/volumes" Dec 01 15:31:40 crc kubenswrapper[4931]: I1201 15:31:40.252861 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680b6f89-d743-4467-b9da-68a831e24fa9" path="/var/lib/kubelet/pods/680b6f89-d743-4467-b9da-68a831e24fa9/volumes" Dec 01 15:31:40 crc kubenswrapper[4931]: I1201 15:31:40.253383 4931 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d8e3fe-15ab-4d9d-939d-4198c8571597" path="/var/lib/kubelet/pods/f8d8e3fe-15ab-4d9d-939d-4198c8571597/volumes" Dec 01 15:31:49 crc kubenswrapper[4931]: I1201 15:31:49.241690 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:31:49 crc kubenswrapper[4931]: E1201 15:31:49.242594 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:32:04 crc kubenswrapper[4931]: I1201 15:32:04.248776 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:32:04 crc kubenswrapper[4931]: E1201 15:32:04.249844 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:32:11 crc kubenswrapper[4931]: I1201 15:32:11.050647 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lbnzw"] Dec 01 15:32:11 crc kubenswrapper[4931]: I1201 15:32:11.059144 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lbnzw"] Dec 01 15:32:12 crc kubenswrapper[4931]: I1201 15:32:12.259092 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2e39a0af-5976-435f-b5c5-6e1e820ea761" path="/var/lib/kubelet/pods/2e39a0af-5976-435f-b5c5-6e1e820ea761/volumes" Dec 01 15:32:18 crc kubenswrapper[4931]: I1201 15:32:18.241688 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:32:18 crc kubenswrapper[4931]: E1201 15:32:18.242521 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:32:18 crc kubenswrapper[4931]: I1201 15:32:18.291857 4931 generic.go:334] "Generic (PLEG): container finished" podID="ff92bfe2-2afc-4cc2-9317-db96b912117c" containerID="b6644b93d6f0d586482e8e5732ca9742b680b94b1aa62466aaab27a3a429d650" exitCode=0 Dec 01 15:32:18 crc kubenswrapper[4931]: I1201 15:32:18.291896 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" event={"ID":"ff92bfe2-2afc-4cc2-9317-db96b912117c","Type":"ContainerDied","Data":"b6644b93d6f0d586482e8e5732ca9742b680b94b1aa62466aaab27a3a429d650"} Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.666638 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.848375 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-inventory\") pod \"ff92bfe2-2afc-4cc2-9317-db96b912117c\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.848620 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w677f\" (UniqueName: \"kubernetes.io/projected/ff92bfe2-2afc-4cc2-9317-db96b912117c-kube-api-access-w677f\") pod \"ff92bfe2-2afc-4cc2-9317-db96b912117c\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.848712 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-ssh-key\") pod \"ff92bfe2-2afc-4cc2-9317-db96b912117c\" (UID: \"ff92bfe2-2afc-4cc2-9317-db96b912117c\") " Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.868946 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff92bfe2-2afc-4cc2-9317-db96b912117c-kube-api-access-w677f" (OuterVolumeSpecName: "kube-api-access-w677f") pod "ff92bfe2-2afc-4cc2-9317-db96b912117c" (UID: "ff92bfe2-2afc-4cc2-9317-db96b912117c"). InnerVolumeSpecName "kube-api-access-w677f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.876534 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-inventory" (OuterVolumeSpecName: "inventory") pod "ff92bfe2-2afc-4cc2-9317-db96b912117c" (UID: "ff92bfe2-2afc-4cc2-9317-db96b912117c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.888823 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ff92bfe2-2afc-4cc2-9317-db96b912117c" (UID: "ff92bfe2-2afc-4cc2-9317-db96b912117c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.950953 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.950989 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff92bfe2-2afc-4cc2-9317-db96b912117c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:32:19 crc kubenswrapper[4931]: I1201 15:32:19.950998 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w677f\" (UniqueName: \"kubernetes.io/projected/ff92bfe2-2afc-4cc2-9317-db96b912117c-kube-api-access-w677f\") on node \"crc\" DevicePath \"\"" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.308859 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" event={"ID":"ff92bfe2-2afc-4cc2-9317-db96b912117c","Type":"ContainerDied","Data":"01fbd94e9f4e095f9f704f3b0e5c1ab9e1ad61dc49539991255ea0516b9518de"} Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.309185 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01fbd94e9f4e095f9f704f3b0e5c1ab9e1ad61dc49539991255ea0516b9518de" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.308981 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-65xbv" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.402895 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8"] Dec 01 15:32:20 crc kubenswrapper[4931]: E1201 15:32:20.403356 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff92bfe2-2afc-4cc2-9317-db96b912117c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.403406 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff92bfe2-2afc-4cc2-9317-db96b912117c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.403843 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff92bfe2-2afc-4cc2-9317-db96b912117c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.404783 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.407273 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.407561 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.407658 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.409658 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.412549 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8"] Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.572463 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.572538 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.572619 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwpvf\" (UniqueName: \"kubernetes.io/projected/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-kube-api-access-fwpvf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.674373 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.674539 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwpvf\" (UniqueName: \"kubernetes.io/projected/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-kube-api-access-fwpvf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.674617 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.688367 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.699795 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.715002 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwpvf\" (UniqueName: \"kubernetes.io/projected/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-kube-api-access-fwpvf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:20 crc kubenswrapper[4931]: I1201 15:32:20.722630 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:21 crc kubenswrapper[4931]: I1201 15:32:21.236284 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8"] Dec 01 15:32:21 crc kubenswrapper[4931]: I1201 15:32:21.318429 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" event={"ID":"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5","Type":"ContainerStarted","Data":"be39730cecc16b049b66286d55bc2b47ecfe024503ca22ee346900184c6eaeb7"} Dec 01 15:32:22 crc kubenswrapper[4931]: I1201 15:32:22.329066 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" event={"ID":"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5","Type":"ContainerStarted","Data":"873aabe55764b32462e0227670e67d9fd8e4011c935feb01e78a3a566aaba4a7"} Dec 01 15:32:22 crc kubenswrapper[4931]: I1201 15:32:22.348336 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" podStartSLOduration=1.7793261569999999 podStartE2EDuration="2.348317881s" podCreationTimestamp="2025-12-01 15:32:20 +0000 UTC" firstStartedPulling="2025-12-01 15:32:21.243462176 +0000 UTC m=+1887.669335843" lastFinishedPulling="2025-12-01 15:32:21.8124539 +0000 UTC m=+1888.238327567" observedRunningTime="2025-12-01 15:32:22.344641995 +0000 UTC m=+1888.770515672" watchObservedRunningTime="2025-12-01 15:32:22.348317881 +0000 UTC m=+1888.774191548" Dec 01 15:32:27 crc kubenswrapper[4931]: I1201 15:32:27.376249 4931 generic.go:334] "Generic (PLEG): container finished" podID="e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5" containerID="873aabe55764b32462e0227670e67d9fd8e4011c935feb01e78a3a566aaba4a7" exitCode=0 Dec 01 15:32:27 crc kubenswrapper[4931]: I1201 15:32:27.376343 4931 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" event={"ID":"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5","Type":"ContainerDied","Data":"873aabe55764b32462e0227670e67d9fd8e4011c935feb01e78a3a566aaba4a7"} Dec 01 15:32:28 crc kubenswrapper[4931]: I1201 15:32:28.763957 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:28 crc kubenswrapper[4931]: I1201 15:32:28.933752 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-ssh-key\") pod \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " Dec 01 15:32:28 crc kubenswrapper[4931]: I1201 15:32:28.934152 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-inventory\") pod \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " Dec 01 15:32:28 crc kubenswrapper[4931]: I1201 15:32:28.934327 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwpvf\" (UniqueName: \"kubernetes.io/projected/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-kube-api-access-fwpvf\") pod \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\" (UID: \"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5\") " Dec 01 15:32:28 crc kubenswrapper[4931]: I1201 15:32:28.939700 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-kube-api-access-fwpvf" (OuterVolumeSpecName: "kube-api-access-fwpvf") pod "e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5" (UID: "e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5"). InnerVolumeSpecName "kube-api-access-fwpvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:32:28 crc kubenswrapper[4931]: I1201 15:32:28.966379 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5" (UID: "e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:32:28 crc kubenswrapper[4931]: I1201 15:32:28.967059 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-inventory" (OuterVolumeSpecName: "inventory") pod "e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5" (UID: "e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.029156 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-ff6b6"] Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.037663 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-ff6b6"] Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.038214 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.038238 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.038250 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwpvf\" (UniqueName: \"kubernetes.io/projected/e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5-kube-api-access-fwpvf\") on node \"crc\" 
DevicePath \"\"" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.407246 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" event={"ID":"e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5","Type":"ContainerDied","Data":"be39730cecc16b049b66286d55bc2b47ecfe024503ca22ee346900184c6eaeb7"} Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.407573 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be39730cecc16b049b66286d55bc2b47ecfe024503ca22ee346900184c6eaeb7" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.407463 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.468829 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq"] Dec 01 15:32:29 crc kubenswrapper[4931]: E1201 15:32:29.469234 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.469256 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.469466 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.470077 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.474933 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.475091 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.475667 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.476313 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.490339 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq"] Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.650557 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-476jq\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.650637 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-476jq\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.650723 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbvmx\" (UniqueName: \"kubernetes.io/projected/c09d1f20-7083-4a46-bb55-734481a5d66c-kube-api-access-bbvmx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-476jq\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.752683 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-476jq\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.752754 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-476jq\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.752808 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbvmx\" (UniqueName: \"kubernetes.io/projected/c09d1f20-7083-4a46-bb55-734481a5d66c-kube-api-access-bbvmx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-476jq\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.756164 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-476jq\" (UID: 
\"c09d1f20-7083-4a46-bb55-734481a5d66c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.757873 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-476jq\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.770425 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbvmx\" (UniqueName: \"kubernetes.io/projected/c09d1f20-7083-4a46-bb55-734481a5d66c-kube-api-access-bbvmx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-476jq\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:29 crc kubenswrapper[4931]: I1201 15:32:29.787635 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:32:30 crc kubenswrapper[4931]: I1201 15:32:30.045449 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l9gbm"] Dec 01 15:32:30 crc kubenswrapper[4931]: I1201 15:32:30.055243 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l9gbm"] Dec 01 15:32:30 crc kubenswrapper[4931]: I1201 15:32:30.256962 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="291244d6-d533-45e1-856d-c12e69935ca7" path="/var/lib/kubelet/pods/291244d6-d533-45e1-856d-c12e69935ca7/volumes" Dec 01 15:32:30 crc kubenswrapper[4931]: I1201 15:32:30.257688 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d7a79d2-c2cd-4b85-a307-dc7a40d55861" path="/var/lib/kubelet/pods/2d7a79d2-c2cd-4b85-a307-dc7a40d55861/volumes" Dec 01 15:32:30 crc kubenswrapper[4931]: I1201 15:32:30.385062 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq"] Dec 01 15:32:30 crc kubenswrapper[4931]: I1201 15:32:30.417040 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" event={"ID":"c09d1f20-7083-4a46-bb55-734481a5d66c","Type":"ContainerStarted","Data":"9fef2dda045d4d8b51a328aa4d3b794cb0448e198ec68385f8443eb4b1c53c60"} Dec 01 15:32:31 crc kubenswrapper[4931]: I1201 15:32:31.242225 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:32:31 crc kubenswrapper[4931]: E1201 15:32:31.242771 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:32:31 crc kubenswrapper[4931]: I1201 15:32:31.429691 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" event={"ID":"c09d1f20-7083-4a46-bb55-734481a5d66c","Type":"ContainerStarted","Data":"fa77d3bba59ed79c184deaa6f085ea27845bf871c1a3cb7dc9de83af30b7fefe"} Dec 01 15:32:31 crc kubenswrapper[4931]: I1201 15:32:31.448986 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" podStartSLOduration=1.9544252260000001 podStartE2EDuration="2.448964639s" podCreationTimestamp="2025-12-01 15:32:29 +0000 UTC" firstStartedPulling="2025-12-01 15:32:30.388561203 +0000 UTC m=+1896.814434870" lastFinishedPulling="2025-12-01 15:32:30.883100616 +0000 UTC m=+1897.308974283" observedRunningTime="2025-12-01 15:32:31.447814536 +0000 UTC m=+1897.873688233" watchObservedRunningTime="2025-12-01 15:32:31.448964639 +0000 UTC m=+1897.874838346" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.281239 4931 scope.go:117] "RemoveContainer" containerID="31592dab405e06f3a6d3afe140f492b10a0efc822f01faa36a8ffaf31da88ac4" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.312314 4931 scope.go:117] "RemoveContainer" containerID="e3cd6179438f9e93cc521c11ddd66e62b79fc89d780242d2b02cdbbed8005172" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.361689 4931 scope.go:117] "RemoveContainer" containerID="6f6c2ab96e502e1e1082e5c5fc79b7f1a5c8684fdd852436e087527d8efec353" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.416476 4931 scope.go:117] "RemoveContainer" containerID="6430c816fc522c0854d435f13fb666786548ec37385c0ed408d4badbc7d87a50" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.458454 
4931 scope.go:117] "RemoveContainer" containerID="e4484967bafb645eb818b6fb9fd830cfe8e8c6470521ebe5a02ee0610bdae3da" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.500803 4931 scope.go:117] "RemoveContainer" containerID="719334c10940453f4f301adda937fa871eaa85b5cdb43cb6abbd70fd672d84a7" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.526444 4931 scope.go:117] "RemoveContainer" containerID="c70d69ad7753d80d60411aa595bb9f7b0851229918227cbb1c8d9e8a83744c6e" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.546092 4931 scope.go:117] "RemoveContainer" containerID="dfd83b7987253e4681798032ce7334d3561da9fe416603ef3705f8e2992b8127" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.575296 4931 scope.go:117] "RemoveContainer" containerID="39e62703271991718440019903e638e081e4adde592492a519255a8d8170bb5a" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.607640 4931 scope.go:117] "RemoveContainer" containerID="efd0fbd715944878f42fa752e4703c60d7935541b9bf9ff5956bcf760b28fb98" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.629216 4931 scope.go:117] "RemoveContainer" containerID="051bac8998ca5294a694a5c6e7d55760e12b3361336bb9ed30ffe90520719374" Dec 01 15:32:37 crc kubenswrapper[4931]: I1201 15:32:37.647886 4931 scope.go:117] "RemoveContainer" containerID="78d7e0fc026beed159aacf1e748a609d7a96ba0569f8978e00d276915fac7a97" Dec 01 15:32:45 crc kubenswrapper[4931]: I1201 15:32:45.241728 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:32:45 crc kubenswrapper[4931]: E1201 15:32:45.242590 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:33:00 crc kubenswrapper[4931]: I1201 15:33:00.242995 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:33:00 crc kubenswrapper[4931]: E1201 15:33:00.243881 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:33:09 crc kubenswrapper[4931]: I1201 15:33:09.056513 4931 generic.go:334] "Generic (PLEG): container finished" podID="c09d1f20-7083-4a46-bb55-734481a5d66c" containerID="fa77d3bba59ed79c184deaa6f085ea27845bf871c1a3cb7dc9de83af30b7fefe" exitCode=0 Dec 01 15:33:09 crc kubenswrapper[4931]: I1201 15:33:09.056582 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" event={"ID":"c09d1f20-7083-4a46-bb55-734481a5d66c","Type":"ContainerDied","Data":"fa77d3bba59ed79c184deaa6f085ea27845bf871c1a3cb7dc9de83af30b7fefe"} Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.484187 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.549354 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-ssh-key\") pod \"c09d1f20-7083-4a46-bb55-734481a5d66c\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.549672 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbvmx\" (UniqueName: \"kubernetes.io/projected/c09d1f20-7083-4a46-bb55-734481a5d66c-kube-api-access-bbvmx\") pod \"c09d1f20-7083-4a46-bb55-734481a5d66c\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.549793 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-inventory\") pod \"c09d1f20-7083-4a46-bb55-734481a5d66c\" (UID: \"c09d1f20-7083-4a46-bb55-734481a5d66c\") " Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.555099 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09d1f20-7083-4a46-bb55-734481a5d66c-kube-api-access-bbvmx" (OuterVolumeSpecName: "kube-api-access-bbvmx") pod "c09d1f20-7083-4a46-bb55-734481a5d66c" (UID: "c09d1f20-7083-4a46-bb55-734481a5d66c"). InnerVolumeSpecName "kube-api-access-bbvmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.574532 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-inventory" (OuterVolumeSpecName: "inventory") pod "c09d1f20-7083-4a46-bb55-734481a5d66c" (UID: "c09d1f20-7083-4a46-bb55-734481a5d66c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.584040 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c09d1f20-7083-4a46-bb55-734481a5d66c" (UID: "c09d1f20-7083-4a46-bb55-734481a5d66c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.651538 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.651568 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c09d1f20-7083-4a46-bb55-734481a5d66c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:33:10 crc kubenswrapper[4931]: I1201 15:33:10.651578 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbvmx\" (UniqueName: \"kubernetes.io/projected/c09d1f20-7083-4a46-bb55-734481a5d66c-kube-api-access-bbvmx\") on node \"crc\" DevicePath \"\"" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.080040 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" event={"ID":"c09d1f20-7083-4a46-bb55-734481a5d66c","Type":"ContainerDied","Data":"9fef2dda045d4d8b51a328aa4d3b794cb0448e198ec68385f8443eb4b1c53c60"} Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.080374 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fef2dda045d4d8b51a328aa4d3b794cb0448e198ec68385f8443eb4b1c53c60" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.080444 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-476jq" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.145250 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4"] Dec 01 15:33:11 crc kubenswrapper[4931]: E1201 15:33:11.145776 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09d1f20-7083-4a46-bb55-734481a5d66c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.145802 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09d1f20-7083-4a46-bb55-734481a5d66c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.146046 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09d1f20-7083-4a46-bb55-734481a5d66c" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.146835 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.149417 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.149468 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.149647 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.149691 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.156787 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4"] Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.241150 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:33:11 crc kubenswrapper[4931]: E1201 15:33:11.241514 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:33:11 crc kubenswrapper[4931]: E1201 15:33:11.262369 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc09d1f20_7083_4a46_bb55_734481a5d66c.slice/crio-9fef2dda045d4d8b51a328aa4d3b794cb0448e198ec68385f8443eb4b1c53c60\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc09d1f20_7083_4a46_bb55_734481a5d66c.slice\": RecentStats: unable to find data in memory cache]" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.262682 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.262879 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plplz\" (UniqueName: \"kubernetes.io/projected/ae337b8b-ad01-493f-9471-aec15d221507-kube-api-access-plplz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.262929 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.364537 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plplz\" (UniqueName: 
\"kubernetes.io/projected/ae337b8b-ad01-493f-9471-aec15d221507-kube-api-access-plplz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.364591 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.364644 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.369156 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.369360 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc 
kubenswrapper[4931]: I1201 15:33:11.380783 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plplz\" (UniqueName: \"kubernetes.io/projected/ae337b8b-ad01-493f-9471-aec15d221507-kube-api-access-plplz\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.428160 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-7465544595-sc668" podUID="2b1d2c6e-39e6-438c-98e8-be76bfa71050" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 01 15:33:11 crc kubenswrapper[4931]: I1201 15:33:11.464113 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:33:12 crc kubenswrapper[4931]: I1201 15:33:12.018481 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4"] Dec 01 15:33:12 crc kubenswrapper[4931]: I1201 15:33:12.088266 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" event={"ID":"ae337b8b-ad01-493f-9471-aec15d221507","Type":"ContainerStarted","Data":"6671661b67661fb66733cc013bbb5457887e49319e160b0e1400989bef8f7cd6"} Dec 01 15:33:13 crc kubenswrapper[4931]: I1201 15:33:13.040435 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-t2sgb"] Dec 01 15:33:13 crc kubenswrapper[4931]: I1201 15:33:13.049671 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-t2sgb"] Dec 01 15:33:13 crc kubenswrapper[4931]: I1201 15:33:13.098145 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" 
event={"ID":"ae337b8b-ad01-493f-9471-aec15d221507","Type":"ContainerStarted","Data":"c48cd33832be78a0f704b71f0623a68989eb270bb4ad9115193d305a40ef2544"} Dec 01 15:33:13 crc kubenswrapper[4931]: I1201 15:33:13.118802 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" podStartSLOduration=1.447172938 podStartE2EDuration="2.118781261s" podCreationTimestamp="2025-12-01 15:33:11 +0000 UTC" firstStartedPulling="2025-12-01 15:33:12.031786009 +0000 UTC m=+1938.457659676" lastFinishedPulling="2025-12-01 15:33:12.703394332 +0000 UTC m=+1939.129267999" observedRunningTime="2025-12-01 15:33:13.114476347 +0000 UTC m=+1939.540350024" watchObservedRunningTime="2025-12-01 15:33:13.118781261 +0000 UTC m=+1939.544654928" Dec 01 15:33:14 crc kubenswrapper[4931]: I1201 15:33:14.251203 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da4dc84-4773-4ff6-878d-e042e108cb65" path="/var/lib/kubelet/pods/6da4dc84-4773-4ff6-878d-e042e108cb65/volumes" Dec 01 15:33:23 crc kubenswrapper[4931]: I1201 15:33:23.241615 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:33:23 crc kubenswrapper[4931]: E1201 15:33:23.242565 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:33:36 crc kubenswrapper[4931]: I1201 15:33:36.242008 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:33:36 crc kubenswrapper[4931]: E1201 15:33:36.242798 4931 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:33:37 crc kubenswrapper[4931]: I1201 15:33:37.849908 4931 scope.go:117] "RemoveContainer" containerID="75d95c7c0b9d6b902491f9421e3295bea94221c088887f27ab37c1ec59319e65" Dec 01 15:33:47 crc kubenswrapper[4931]: I1201 15:33:47.241513 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:33:47 crc kubenswrapper[4931]: E1201 15:33:47.242236 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:34:00 crc kubenswrapper[4931]: I1201 15:34:00.241589 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:34:00 crc kubenswrapper[4931]: I1201 15:34:00.497758 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"98d69a0489f57745ef5af73760a1c1eabe377307f549877e60e78dd8e543b8a4"} Dec 01 15:34:05 crc kubenswrapper[4931]: I1201 15:34:05.537183 4931 generic.go:334] "Generic (PLEG): container finished" podID="ae337b8b-ad01-493f-9471-aec15d221507" 
containerID="c48cd33832be78a0f704b71f0623a68989eb270bb4ad9115193d305a40ef2544" exitCode=0 Dec 01 15:34:05 crc kubenswrapper[4931]: I1201 15:34:05.537246 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" event={"ID":"ae337b8b-ad01-493f-9471-aec15d221507","Type":"ContainerDied","Data":"c48cd33832be78a0f704b71f0623a68989eb270bb4ad9115193d305a40ef2544"} Dec 01 15:34:06 crc kubenswrapper[4931]: I1201 15:34:06.938559 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.045270 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-ssh-key\") pod \"ae337b8b-ad01-493f-9471-aec15d221507\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.045439 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-inventory\") pod \"ae337b8b-ad01-493f-9471-aec15d221507\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.045673 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plplz\" (UniqueName: \"kubernetes.io/projected/ae337b8b-ad01-493f-9471-aec15d221507-kube-api-access-plplz\") pod \"ae337b8b-ad01-493f-9471-aec15d221507\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.050699 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae337b8b-ad01-493f-9471-aec15d221507-kube-api-access-plplz" (OuterVolumeSpecName: "kube-api-access-plplz") pod 
"ae337b8b-ad01-493f-9471-aec15d221507" (UID: "ae337b8b-ad01-493f-9471-aec15d221507"). InnerVolumeSpecName "kube-api-access-plplz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:34:07 crc kubenswrapper[4931]: E1201 15:34:07.069206 4931 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-inventory podName:ae337b8b-ad01-493f-9471-aec15d221507 nodeName:}" failed. No retries permitted until 2025-12-01 15:34:07.569181245 +0000 UTC m=+1993.995054912 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-inventory") pod "ae337b8b-ad01-493f-9471-aec15d221507" (UID: "ae337b8b-ad01-493f-9471-aec15d221507") : error deleting /var/lib/kubelet/pods/ae337b8b-ad01-493f-9471-aec15d221507/volume-subpaths: remove /var/lib/kubelet/pods/ae337b8b-ad01-493f-9471-aec15d221507/volume-subpaths: no such file or directory Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.071507 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ae337b8b-ad01-493f-9471-aec15d221507" (UID: "ae337b8b-ad01-493f-9471-aec15d221507"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.148133 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.148175 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plplz\" (UniqueName: \"kubernetes.io/projected/ae337b8b-ad01-493f-9471-aec15d221507-kube-api-access-plplz\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.562492 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" event={"ID":"ae337b8b-ad01-493f-9471-aec15d221507","Type":"ContainerDied","Data":"6671661b67661fb66733cc013bbb5457887e49319e160b0e1400989bef8f7cd6"} Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.562532 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6671661b67661fb66733cc013bbb5457887e49319e160b0e1400989bef8f7cd6" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.562553 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.645212 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-f4bm9"] Dec 01 15:34:07 crc kubenswrapper[4931]: E1201 15:34:07.645766 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae337b8b-ad01-493f-9471-aec15d221507" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.645795 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae337b8b-ad01-493f-9471-aec15d221507" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.646039 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae337b8b-ad01-493f-9471-aec15d221507" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.646836 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.655570 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-inventory\") pod \"ae337b8b-ad01-493f-9471-aec15d221507\" (UID: \"ae337b8b-ad01-493f-9471-aec15d221507\") " Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.660193 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-inventory" (OuterVolumeSpecName: "inventory") pod "ae337b8b-ad01-493f-9471-aec15d221507" (UID: "ae337b8b-ad01-493f-9471-aec15d221507"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.662968 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-f4bm9"] Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.757228 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-f4bm9\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.757549 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-f4bm9\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.757582 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnq6t\" (UniqueName: \"kubernetes.io/projected/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-kube-api-access-pnq6t\") pod \"ssh-known-hosts-edpm-deployment-f4bm9\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.757654 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae337b8b-ad01-493f-9471-aec15d221507-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.859001 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-f4bm9\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.859065 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnq6t\" (UniqueName: \"kubernetes.io/projected/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-kube-api-access-pnq6t\") pod \"ssh-known-hosts-edpm-deployment-f4bm9\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.859161 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-f4bm9\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.864692 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-f4bm9\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.865460 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-f4bm9\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:07 crc kubenswrapper[4931]: I1201 15:34:07.876849 4931 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-pnq6t\" (UniqueName: \"kubernetes.io/projected/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-kube-api-access-pnq6t\") pod \"ssh-known-hosts-edpm-deployment-f4bm9\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:08 crc kubenswrapper[4931]: I1201 15:34:08.018480 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:08 crc kubenswrapper[4931]: I1201 15:34:08.521932 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-f4bm9"] Dec 01 15:34:08 crc kubenswrapper[4931]: I1201 15:34:08.574545 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" event={"ID":"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc","Type":"ContainerStarted","Data":"e5973ecada42b320b728bb9005d15bcd58d9b97c33c0aa7b22ef53f75a8edb80"} Dec 01 15:34:09 crc kubenswrapper[4931]: I1201 15:34:09.586096 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" event={"ID":"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc","Type":"ContainerStarted","Data":"5b2af016852b943084cd6fae9512980f4230d193a5179dff2e52bdc47189d57a"} Dec 01 15:34:09 crc kubenswrapper[4931]: I1201 15:34:09.610220 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" podStartSLOduration=2.172843525 podStartE2EDuration="2.610165434s" podCreationTimestamp="2025-12-01 15:34:07 +0000 UTC" firstStartedPulling="2025-12-01 15:34:08.525437918 +0000 UTC m=+1994.951311585" lastFinishedPulling="2025-12-01 15:34:08.962759827 +0000 UTC m=+1995.388633494" observedRunningTime="2025-12-01 15:34:09.604156111 +0000 UTC m=+1996.030029778" watchObservedRunningTime="2025-12-01 15:34:09.610165434 +0000 UTC m=+1996.036039111" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.422836 
4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9flrm"] Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.426001 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.443657 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9flrm"] Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.560879 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-catalog-content\") pod \"certified-operators-9flrm\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.560965 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-utilities\") pod \"certified-operators-9flrm\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.561484 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzn9l\" (UniqueName: \"kubernetes.io/projected/0c056caa-225d-4d5a-8712-7a09a99a2887-kube-api-access-mzn9l\") pod \"certified-operators-9flrm\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.702890 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzn9l\" (UniqueName: \"kubernetes.io/projected/0c056caa-225d-4d5a-8712-7a09a99a2887-kube-api-access-mzn9l\") pod 
\"certified-operators-9flrm\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.703065 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-catalog-content\") pod \"certified-operators-9flrm\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.703121 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-utilities\") pod \"certified-operators-9flrm\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.703668 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-utilities\") pod \"certified-operators-9flrm\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.703805 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-catalog-content\") pod \"certified-operators-9flrm\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.722605 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzn9l\" (UniqueName: \"kubernetes.io/projected/0c056caa-225d-4d5a-8712-7a09a99a2887-kube-api-access-mzn9l\") pod \"certified-operators-9flrm\" (UID: 
\"0c056caa-225d-4d5a-8712-7a09a99a2887\") " pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:13 crc kubenswrapper[4931]: I1201 15:34:13.755716 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:14 crc kubenswrapper[4931]: I1201 15:34:14.267853 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9flrm"] Dec 01 15:34:14 crc kubenswrapper[4931]: W1201 15:34:14.275293 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c056caa_225d_4d5a_8712_7a09a99a2887.slice/crio-9fd1eee9ec78dfe3ec0081ed11811c10c685afa05959bf41824bd62121d7f3e2 WatchSource:0}: Error finding container 9fd1eee9ec78dfe3ec0081ed11811c10c685afa05959bf41824bd62121d7f3e2: Status 404 returned error can't find the container with id 9fd1eee9ec78dfe3ec0081ed11811c10c685afa05959bf41824bd62121d7f3e2 Dec 01 15:34:14 crc kubenswrapper[4931]: I1201 15:34:14.624943 4931 generic.go:334] "Generic (PLEG): container finished" podID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerID="d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933" exitCode=0 Dec 01 15:34:14 crc kubenswrapper[4931]: I1201 15:34:14.625002 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flrm" event={"ID":"0c056caa-225d-4d5a-8712-7a09a99a2887","Type":"ContainerDied","Data":"d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933"} Dec 01 15:34:14 crc kubenswrapper[4931]: I1201 15:34:14.625029 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flrm" event={"ID":"0c056caa-225d-4d5a-8712-7a09a99a2887","Type":"ContainerStarted","Data":"9fd1eee9ec78dfe3ec0081ed11811c10c685afa05959bf41824bd62121d7f3e2"} Dec 01 15:34:16 crc kubenswrapper[4931]: I1201 15:34:16.643690 4931 generic.go:334] "Generic 
(PLEG): container finished" podID="04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc" containerID="5b2af016852b943084cd6fae9512980f4230d193a5179dff2e52bdc47189d57a" exitCode=0 Dec 01 15:34:16 crc kubenswrapper[4931]: I1201 15:34:16.643758 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" event={"ID":"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc","Type":"ContainerDied","Data":"5b2af016852b943084cd6fae9512980f4230d193a5179dff2e52bdc47189d57a"} Dec 01 15:34:17 crc kubenswrapper[4931]: I1201 15:34:17.653962 4931 generic.go:334] "Generic (PLEG): container finished" podID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerID="27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e" exitCode=0 Dec 01 15:34:17 crc kubenswrapper[4931]: I1201 15:34:17.654052 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flrm" event={"ID":"0c056caa-225d-4d5a-8712-7a09a99a2887","Type":"ContainerDied","Data":"27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e"} Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.043108 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.187615 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-inventory-0\") pod \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.187704 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnq6t\" (UniqueName: \"kubernetes.io/projected/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-kube-api-access-pnq6t\") pod \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.187762 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-ssh-key-openstack-edpm-ipam\") pod \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\" (UID: \"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc\") " Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.193262 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-kube-api-access-pnq6t" (OuterVolumeSpecName: "kube-api-access-pnq6t") pod "04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc" (UID: "04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc"). InnerVolumeSpecName "kube-api-access-pnq6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.219196 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc" (UID: "04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.220299 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc" (UID: "04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.289849 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnq6t\" (UniqueName: \"kubernetes.io/projected/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-kube-api-access-pnq6t\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.289890 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.289901 4931 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.665803 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.665802 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-f4bm9" event={"ID":"04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc","Type":"ContainerDied","Data":"e5973ecada42b320b728bb9005d15bcd58d9b97c33c0aa7b22ef53f75a8edb80"} Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.666346 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5973ecada42b320b728bb9005d15bcd58d9b97c33c0aa7b22ef53f75a8edb80" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.668204 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flrm" event={"ID":"0c056caa-225d-4d5a-8712-7a09a99a2887","Type":"ContainerStarted","Data":"804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef"} Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.707928 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9flrm" podStartSLOduration=2.015316512 podStartE2EDuration="5.707901429s" podCreationTimestamp="2025-12-01 15:34:13 +0000 UTC" firstStartedPulling="2025-12-01 15:34:14.626783005 +0000 UTC m=+2001.052656672" lastFinishedPulling="2025-12-01 15:34:18.319367922 +0000 UTC m=+2004.745241589" observedRunningTime="2025-12-01 15:34:18.689721996 +0000 UTC m=+2005.115595673" watchObservedRunningTime="2025-12-01 15:34:18.707901429 +0000 UTC m=+2005.133775096" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.747993 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm"] Dec 01 15:34:18 crc kubenswrapper[4931]: E1201 15:34:18.748511 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc" containerName="ssh-known-hosts-edpm-deployment" Dec 01 15:34:18 crc 
kubenswrapper[4931]: I1201 15:34:18.748539 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc" containerName="ssh-known-hosts-edpm-deployment" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.748745 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc" containerName="ssh-known-hosts-edpm-deployment" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.749408 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.752133 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.752437 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.752891 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.753317 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.759662 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm"] Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.901887 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kqxsm\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 
15:34:18.902024 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjr4c\" (UniqueName: \"kubernetes.io/projected/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-kube-api-access-pjr4c\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kqxsm\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:18 crc kubenswrapper[4931]: I1201 15:34:18.902214 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kqxsm\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:19 crc kubenswrapper[4931]: I1201 15:34:19.004112 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kqxsm\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:19 crc kubenswrapper[4931]: I1201 15:34:19.004210 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjr4c\" (UniqueName: \"kubernetes.io/projected/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-kube-api-access-pjr4c\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kqxsm\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:19 crc kubenswrapper[4931]: I1201 15:34:19.004248 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-ssh-key\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-kqxsm\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:19 crc kubenswrapper[4931]: I1201 15:34:19.011130 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kqxsm\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:19 crc kubenswrapper[4931]: I1201 15:34:19.011935 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kqxsm\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:19 crc kubenswrapper[4931]: I1201 15:34:19.021960 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjr4c\" (UniqueName: \"kubernetes.io/projected/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-kube-api-access-pjr4c\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-kqxsm\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:19 crc kubenswrapper[4931]: I1201 15:34:19.070216 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:19 crc kubenswrapper[4931]: I1201 15:34:19.589565 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm"] Dec 01 15:34:19 crc kubenswrapper[4931]: I1201 15:34:19.676366 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" event={"ID":"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67","Type":"ContainerStarted","Data":"a623b9a148b8ecbc4d3426b8c34d92ec4c4d55e064d794f1fbad2663716d75dc"} Dec 01 15:34:20 crc kubenswrapper[4931]: I1201 15:34:20.687451 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" event={"ID":"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67","Type":"ContainerStarted","Data":"8e7d0c075fd93a94a644c1786ad2c8cad678778150a37c45bbf9897270dda89e"} Dec 01 15:34:20 crc kubenswrapper[4931]: I1201 15:34:20.706035 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" podStartSLOduration=2.239716455 podStartE2EDuration="2.706018187s" podCreationTimestamp="2025-12-01 15:34:18 +0000 UTC" firstStartedPulling="2025-12-01 15:34:19.592550665 +0000 UTC m=+2006.018424332" lastFinishedPulling="2025-12-01 15:34:20.058852377 +0000 UTC m=+2006.484726064" observedRunningTime="2025-12-01 15:34:20.705374948 +0000 UTC m=+2007.131248615" watchObservedRunningTime="2025-12-01 15:34:20.706018187 +0000 UTC m=+2007.131891854" Dec 01 15:34:23 crc kubenswrapper[4931]: I1201 15:34:23.755934 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:23 crc kubenswrapper[4931]: I1201 15:34:23.756507 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:23 crc 
kubenswrapper[4931]: I1201 15:34:23.804034 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:24 crc kubenswrapper[4931]: I1201 15:34:24.774290 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:24 crc kubenswrapper[4931]: I1201 15:34:24.821575 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9flrm"] Dec 01 15:34:26 crc kubenswrapper[4931]: I1201 15:34:26.740998 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9flrm" podUID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerName="registry-server" containerID="cri-o://804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef" gracePeriod=2 Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.385619 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.558485 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-catalog-content\") pod \"0c056caa-225d-4d5a-8712-7a09a99a2887\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.558552 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-utilities\") pod \"0c056caa-225d-4d5a-8712-7a09a99a2887\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.558861 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzn9l\" (UniqueName: \"kubernetes.io/projected/0c056caa-225d-4d5a-8712-7a09a99a2887-kube-api-access-mzn9l\") pod \"0c056caa-225d-4d5a-8712-7a09a99a2887\" (UID: \"0c056caa-225d-4d5a-8712-7a09a99a2887\") " Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.559713 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-utilities" (OuterVolumeSpecName: "utilities") pod "0c056caa-225d-4d5a-8712-7a09a99a2887" (UID: "0c056caa-225d-4d5a-8712-7a09a99a2887"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.564731 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c056caa-225d-4d5a-8712-7a09a99a2887-kube-api-access-mzn9l" (OuterVolumeSpecName: "kube-api-access-mzn9l") pod "0c056caa-225d-4d5a-8712-7a09a99a2887" (UID: "0c056caa-225d-4d5a-8712-7a09a99a2887"). InnerVolumeSpecName "kube-api-access-mzn9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.612101 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c056caa-225d-4d5a-8712-7a09a99a2887" (UID: "0c056caa-225d-4d5a-8712-7a09a99a2887"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.661156 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.661190 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzn9l\" (UniqueName: \"kubernetes.io/projected/0c056caa-225d-4d5a-8712-7a09a99a2887-kube-api-access-mzn9l\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.661201 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c056caa-225d-4d5a-8712-7a09a99a2887-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.754146 4931 generic.go:334] "Generic (PLEG): container finished" podID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerID="804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef" exitCode=0 Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.754186 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9flrm" event={"ID":"0c056caa-225d-4d5a-8712-7a09a99a2887","Type":"ContainerDied","Data":"804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef"} Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.754245 4931 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-9flrm" event={"ID":"0c056caa-225d-4d5a-8712-7a09a99a2887","Type":"ContainerDied","Data":"9fd1eee9ec78dfe3ec0081ed11811c10c685afa05959bf41824bd62121d7f3e2"} Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.754261 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9flrm" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.754271 4931 scope.go:117] "RemoveContainer" containerID="804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.777344 4931 scope.go:117] "RemoveContainer" containerID="27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.789161 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9flrm"] Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.796519 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9flrm"] Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.817652 4931 scope.go:117] "RemoveContainer" containerID="d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.846716 4931 scope.go:117] "RemoveContainer" containerID="804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef" Dec 01 15:34:27 crc kubenswrapper[4931]: E1201 15:34:27.847375 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef\": container with ID starting with 804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef not found: ID does not exist" containerID="804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 
15:34:27.847513 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef"} err="failed to get container status \"804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef\": rpc error: code = NotFound desc = could not find container \"804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef\": container with ID starting with 804d526418d85a3a8f875a788936985449b596a07901342516024af74f0e79ef not found: ID does not exist" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.847543 4931 scope.go:117] "RemoveContainer" containerID="27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e" Dec 01 15:34:27 crc kubenswrapper[4931]: E1201 15:34:27.847934 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e\": container with ID starting with 27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e not found: ID does not exist" containerID="27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.847996 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e"} err="failed to get container status \"27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e\": rpc error: code = NotFound desc = could not find container \"27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e\": container with ID starting with 27c4eca0dbd407d9092d547ed34454d6371adc589c031f5d267992f914cc751e not found: ID does not exist" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.848035 4931 scope.go:117] "RemoveContainer" containerID="d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933" Dec 01 15:34:27 crc 
kubenswrapper[4931]: E1201 15:34:27.848628 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933\": container with ID starting with d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933 not found: ID does not exist" containerID="d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933" Dec 01 15:34:27 crc kubenswrapper[4931]: I1201 15:34:27.848666 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933"} err="failed to get container status \"d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933\": rpc error: code = NotFound desc = could not find container \"d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933\": container with ID starting with d2f68f90088306e602d032cddc58cbf20e1e4feda4e5107fa7b5d6fff1b8f933 not found: ID does not exist" Dec 01 15:34:28 crc kubenswrapper[4931]: I1201 15:34:28.284823 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c056caa-225d-4d5a-8712-7a09a99a2887" path="/var/lib/kubelet/pods/0c056caa-225d-4d5a-8712-7a09a99a2887/volumes" Dec 01 15:34:30 crc kubenswrapper[4931]: I1201 15:34:30.802993 4931 generic.go:334] "Generic (PLEG): container finished" podID="def3b4c2-cbcc-4aae-a750-6c96cd0d8f67" containerID="8e7d0c075fd93a94a644c1786ad2c8cad678778150a37c45bbf9897270dda89e" exitCode=0 Dec 01 15:34:30 crc kubenswrapper[4931]: I1201 15:34:30.803101 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" event={"ID":"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67","Type":"ContainerDied","Data":"8e7d0c075fd93a94a644c1786ad2c8cad678778150a37c45bbf9897270dda89e"} Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.202428 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.359969 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-ssh-key\") pod \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.360499 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr4c\" (UniqueName: \"kubernetes.io/projected/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-kube-api-access-pjr4c\") pod \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.360534 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-inventory\") pod \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\" (UID: \"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67\") " Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.371345 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-kube-api-access-pjr4c" (OuterVolumeSpecName: "kube-api-access-pjr4c") pod "def3b4c2-cbcc-4aae-a750-6c96cd0d8f67" (UID: "def3b4c2-cbcc-4aae-a750-6c96cd0d8f67"). InnerVolumeSpecName "kube-api-access-pjr4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.471749 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr4c\" (UniqueName: \"kubernetes.io/projected/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-kube-api-access-pjr4c\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.475416 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-inventory" (OuterVolumeSpecName: "inventory") pod "def3b4c2-cbcc-4aae-a750-6c96cd0d8f67" (UID: "def3b4c2-cbcc-4aae-a750-6c96cd0d8f67"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.478548 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "def3b4c2-cbcc-4aae-a750-6c96cd0d8f67" (UID: "def3b4c2-cbcc-4aae-a750-6c96cd0d8f67"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.573756 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.573792 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/def3b4c2-cbcc-4aae-a750-6c96cd0d8f67-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.828202 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" event={"ID":"def3b4c2-cbcc-4aae-a750-6c96cd0d8f67","Type":"ContainerDied","Data":"a623b9a148b8ecbc4d3426b8c34d92ec4c4d55e064d794f1fbad2663716d75dc"} Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.828231 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-kqxsm" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.828243 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a623b9a148b8ecbc4d3426b8c34d92ec4c4d55e064d794f1fbad2663716d75dc" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.886556 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4"] Dec 01 15:34:32 crc kubenswrapper[4931]: E1201 15:34:32.887007 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerName="extract-content" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.887032 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerName="extract-content" Dec 01 15:34:32 crc kubenswrapper[4931]: E1201 15:34:32.887049 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def3b4c2-cbcc-4aae-a750-6c96cd0d8f67" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.887059 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="def3b4c2-cbcc-4aae-a750-6c96cd0d8f67" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:34:32 crc kubenswrapper[4931]: E1201 15:34:32.887084 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerName="extract-utilities" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.887093 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerName="extract-utilities" Dec 01 15:34:32 crc kubenswrapper[4931]: E1201 15:34:32.887130 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerName="registry-server" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 
15:34:32.887137 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerName="registry-server" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.887330 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="def3b4c2-cbcc-4aae-a750-6c96cd0d8f67" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.887362 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c056caa-225d-4d5a-8712-7a09a99a2887" containerName="registry-server" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.888128 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.899133 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.900468 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4"] Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.906540 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.906974 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.908033 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.983753 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbzkn\" (UniqueName: \"kubernetes.io/projected/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-kube-api-access-gbzkn\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.983829 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:32 crc kubenswrapper[4931]: I1201 15:34:32.983904 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:33 crc kubenswrapper[4931]: I1201 15:34:33.085313 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbzkn\" (UniqueName: \"kubernetes.io/projected/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-kube-api-access-gbzkn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:33 crc kubenswrapper[4931]: I1201 15:34:33.085492 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:33 crc kubenswrapper[4931]: I1201 15:34:33.085644 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:33 crc kubenswrapper[4931]: I1201 15:34:33.091534 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:33 crc kubenswrapper[4931]: I1201 15:34:33.092249 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:33 crc kubenswrapper[4931]: I1201 15:34:33.106262 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbzkn\" (UniqueName: \"kubernetes.io/projected/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-kube-api-access-gbzkn\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:33 crc kubenswrapper[4931]: I1201 15:34:33.213130 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:33 crc kubenswrapper[4931]: I1201 15:34:33.976282 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4"] Dec 01 15:34:33 crc kubenswrapper[4931]: I1201 15:34:33.982365 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 15:34:34 crc kubenswrapper[4931]: I1201 15:34:34.901875 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" event={"ID":"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c","Type":"ContainerStarted","Data":"e585920f18fa4b52373ecc707aaa96f083bb8d8d37fe7e61c008001f314970b6"} Dec 01 15:34:37 crc kubenswrapper[4931]: I1201 15:34:37.926813 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" event={"ID":"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c","Type":"ContainerStarted","Data":"1b3506cad899da8f53447094f7292aee5b3e049b19210726b5cbcc81572e0e62"} Dec 01 15:34:38 crc kubenswrapper[4931]: I1201 15:34:38.953115 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" podStartSLOduration=3.554676507 podStartE2EDuration="6.953093892s" podCreationTimestamp="2025-12-01 15:34:32 +0000 UTC" firstStartedPulling="2025-12-01 15:34:33.982160622 +0000 UTC m=+2020.408034289" lastFinishedPulling="2025-12-01 15:34:37.380578007 +0000 UTC m=+2023.806451674" observedRunningTime="2025-12-01 15:34:38.944936734 +0000 UTC m=+2025.370810401" watchObservedRunningTime="2025-12-01 15:34:38.953093892 +0000 UTC m=+2025.378967559" Dec 01 15:34:48 crc kubenswrapper[4931]: I1201 15:34:48.006218 4931 generic.go:334] "Generic (PLEG): container finished" podID="616e2b9d-1da1-44fd-98c3-7f8cbc8d686c" 
containerID="1b3506cad899da8f53447094f7292aee5b3e049b19210726b5cbcc81572e0e62" exitCode=0 Dec 01 15:34:48 crc kubenswrapper[4931]: I1201 15:34:48.006287 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" event={"ID":"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c","Type":"ContainerDied","Data":"1b3506cad899da8f53447094f7292aee5b3e049b19210726b5cbcc81572e0e62"} Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.419992 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.475593 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-inventory\") pod \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.475662 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-ssh-key\") pod \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.475765 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbzkn\" (UniqueName: \"kubernetes.io/projected/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-kube-api-access-gbzkn\") pod \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\" (UID: \"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c\") " Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.481590 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-kube-api-access-gbzkn" (OuterVolumeSpecName: "kube-api-access-gbzkn") pod "616e2b9d-1da1-44fd-98c3-7f8cbc8d686c" 
(UID: "616e2b9d-1da1-44fd-98c3-7f8cbc8d686c"). InnerVolumeSpecName "kube-api-access-gbzkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.499931 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "616e2b9d-1da1-44fd-98c3-7f8cbc8d686c" (UID: "616e2b9d-1da1-44fd-98c3-7f8cbc8d686c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.507659 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-inventory" (OuterVolumeSpecName: "inventory") pod "616e2b9d-1da1-44fd-98c3-7f8cbc8d686c" (UID: "616e2b9d-1da1-44fd-98c3-7f8cbc8d686c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.577960 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.578001 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:49 crc kubenswrapper[4931]: I1201 15:34:49.578014 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbzkn\" (UniqueName: \"kubernetes.io/projected/616e2b9d-1da1-44fd-98c3-7f8cbc8d686c-kube-api-access-gbzkn\") on node \"crc\" DevicePath \"\"" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.023641 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" 
event={"ID":"616e2b9d-1da1-44fd-98c3-7f8cbc8d686c","Type":"ContainerDied","Data":"e585920f18fa4b52373ecc707aaa96f083bb8d8d37fe7e61c008001f314970b6"} Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.023685 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e585920f18fa4b52373ecc707aaa96f083bb8d8d37fe7e61c008001f314970b6" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.023706 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.124971 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5"] Dec 01 15:34:50 crc kubenswrapper[4931]: E1201 15:34:50.127825 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616e2b9d-1da1-44fd-98c3-7f8cbc8d686c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.127853 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="616e2b9d-1da1-44fd-98c3-7f8cbc8d686c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.128216 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="616e2b9d-1da1-44fd-98c3-7f8cbc8d686c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.129421 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.132881 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.133143 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.133526 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.133904 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.134042 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.134064 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.134184 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.134200 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.135810 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5"] Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190306 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190408 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190611 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190668 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190751 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190771 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190817 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190843 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190865 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.190926 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.191057 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv4kh\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-kube-api-access-pv4kh\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.191117 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.191260 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.191352 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293380 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293476 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293524 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293591 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293625 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293659 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293682 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293724 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293743 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293776 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293817 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293915 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv4kh\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-kube-api-access-pv4kh\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.293977 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.294017 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.299834 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" 
Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.299991 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.300117 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.300801 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.301362 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.301473 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.301741 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.301876 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.302325 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.303132 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-neutron-metadata-default-certs-0\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.303693 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.304591 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.305574 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.317371 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv4kh\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-kube-api-access-pv4kh\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z54c5\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:50 crc kubenswrapper[4931]: I1201 15:34:50.455541 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:34:51 crc kubenswrapper[4931]: I1201 15:34:51.001285 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5"] Dec 01 15:34:51 crc kubenswrapper[4931]: I1201 15:34:51.033594 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" event={"ID":"624618ca-ac0b-4fcf-bcf8-a7e744c98241","Type":"ContainerStarted","Data":"899b41b17153c38eec0f844bdbfc87a24ba7ed42ac4d465c06848ce9566df2d8"} Dec 01 15:34:52 crc kubenswrapper[4931]: I1201 15:34:52.042619 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" event={"ID":"624618ca-ac0b-4fcf-bcf8-a7e744c98241","Type":"ContainerStarted","Data":"b60697e8021dbe946859ae4f51e0ca2b139f8f7518ea325511c579338c477611"} Dec 01 15:34:52 crc kubenswrapper[4931]: I1201 15:34:52.064305 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" podStartSLOduration=1.5712514020000001 podStartE2EDuration="2.064282962s" podCreationTimestamp="2025-12-01 15:34:50 +0000 UTC" firstStartedPulling="2025-12-01 15:34:51.002543663 +0000 UTC m=+2037.428417330" lastFinishedPulling="2025-12-01 15:34:51.495575223 +0000 UTC m=+2037.921448890" observedRunningTime="2025-12-01 15:34:52.061627578 +0000 UTC m=+2038.487501255" watchObservedRunningTime="2025-12-01 15:34:52.064282962 +0000 UTC m=+2038.490156629" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.467357 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v6nm2"] Dec 01 
15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.470271 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.476909 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v6nm2"] Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.651084 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-utilities\") pod \"redhat-operators-v6nm2\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.651477 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbmv7\" (UniqueName: \"kubernetes.io/projected/a36b884a-b808-4e6c-9db0-20a6d5d56df8-kube-api-access-bbmv7\") pod \"redhat-operators-v6nm2\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.651570 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-catalog-content\") pod \"redhat-operators-v6nm2\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.753043 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbmv7\" (UniqueName: \"kubernetes.io/projected/a36b884a-b808-4e6c-9db0-20a6d5d56df8-kube-api-access-bbmv7\") pod \"redhat-operators-v6nm2\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 
15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.753101 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-catalog-content\") pod \"redhat-operators-v6nm2\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.753178 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-utilities\") pod \"redhat-operators-v6nm2\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.753753 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-catalog-content\") pod \"redhat-operators-v6nm2\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.753784 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-utilities\") pod \"redhat-operators-v6nm2\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.780074 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbmv7\" (UniqueName: \"kubernetes.io/projected/a36b884a-b808-4e6c-9db0-20a6d5d56df8-kube-api-access-bbmv7\") pod \"redhat-operators-v6nm2\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:05 crc kubenswrapper[4931]: I1201 15:35:05.805703 4931 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:06 crc kubenswrapper[4931]: I1201 15:35:06.308931 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v6nm2"] Dec 01 15:35:07 crc kubenswrapper[4931]: I1201 15:35:07.191964 4931 generic.go:334] "Generic (PLEG): container finished" podID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerID="01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c" exitCode=0 Dec 01 15:35:07 crc kubenswrapper[4931]: I1201 15:35:07.192078 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6nm2" event={"ID":"a36b884a-b808-4e6c-9db0-20a6d5d56df8","Type":"ContainerDied","Data":"01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c"} Dec 01 15:35:07 crc kubenswrapper[4931]: I1201 15:35:07.192274 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6nm2" event={"ID":"a36b884a-b808-4e6c-9db0-20a6d5d56df8","Type":"ContainerStarted","Data":"20e3d5617f98dff76ce3b1f4702cd11786572d81109c82d7582cb62eb632fbdf"} Dec 01 15:35:10 crc kubenswrapper[4931]: I1201 15:35:10.224419 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6nm2" event={"ID":"a36b884a-b808-4e6c-9db0-20a6d5d56df8","Type":"ContainerStarted","Data":"a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807"} Dec 01 15:35:13 crc kubenswrapper[4931]: I1201 15:35:13.249571 4931 generic.go:334] "Generic (PLEG): container finished" podID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerID="a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807" exitCode=0 Dec 01 15:35:13 crc kubenswrapper[4931]: I1201 15:35:13.249626 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6nm2" 
event={"ID":"a36b884a-b808-4e6c-9db0-20a6d5d56df8","Type":"ContainerDied","Data":"a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807"} Dec 01 15:35:16 crc kubenswrapper[4931]: I1201 15:35:16.276442 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6nm2" event={"ID":"a36b884a-b808-4e6c-9db0-20a6d5d56df8","Type":"ContainerStarted","Data":"71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185"} Dec 01 15:35:16 crc kubenswrapper[4931]: I1201 15:35:16.310611 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v6nm2" podStartSLOduration=2.997203529 podStartE2EDuration="11.310586273s" podCreationTimestamp="2025-12-01 15:35:05 +0000 UTC" firstStartedPulling="2025-12-01 15:35:07.194974967 +0000 UTC m=+2053.620848634" lastFinishedPulling="2025-12-01 15:35:15.508357711 +0000 UTC m=+2061.934231378" observedRunningTime="2025-12-01 15:35:16.300233354 +0000 UTC m=+2062.726107031" watchObservedRunningTime="2025-12-01 15:35:16.310586273 +0000 UTC m=+2062.736459950" Dec 01 15:35:25 crc kubenswrapper[4931]: I1201 15:35:25.806859 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:25 crc kubenswrapper[4931]: I1201 15:35:25.807441 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:25 crc kubenswrapper[4931]: I1201 15:35:25.859049 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:26 crc kubenswrapper[4931]: I1201 15:35:26.400237 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:26 crc kubenswrapper[4931]: I1201 15:35:26.447944 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-v6nm2"] Dec 01 15:35:28 crc kubenswrapper[4931]: I1201 15:35:28.372629 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v6nm2" podUID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerName="registry-server" containerID="cri-o://71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185" gracePeriod=2 Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.381081 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.384427 4931 generic.go:334] "Generic (PLEG): container finished" podID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerID="71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185" exitCode=0 Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.384463 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6nm2" event={"ID":"a36b884a-b808-4e6c-9db0-20a6d5d56df8","Type":"ContainerDied","Data":"71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185"} Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.384488 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v6nm2" event={"ID":"a36b884a-b808-4e6c-9db0-20a6d5d56df8","Type":"ContainerDied","Data":"20e3d5617f98dff76ce3b1f4702cd11786572d81109c82d7582cb62eb632fbdf"} Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.384503 4931 scope.go:117] "RemoveContainer" containerID="71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.384537 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v6nm2" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.409814 4931 scope.go:117] "RemoveContainer" containerID="a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.441369 4931 scope.go:117] "RemoveContainer" containerID="01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.486282 4931 scope.go:117] "RemoveContainer" containerID="71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185" Dec 01 15:35:29 crc kubenswrapper[4931]: E1201 15:35:29.486706 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185\": container with ID starting with 71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185 not found: ID does not exist" containerID="71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.486751 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185"} err="failed to get container status \"71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185\": rpc error: code = NotFound desc = could not find container \"71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185\": container with ID starting with 71664cbca42928c23ed495ec01c2f0e20371c370a50507dc97ef6aeb4a901185 not found: ID does not exist" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.486780 4931 scope.go:117] "RemoveContainer" containerID="a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807" Dec 01 15:35:29 crc kubenswrapper[4931]: E1201 15:35:29.487160 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807\": container with ID starting with a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807 not found: ID does not exist" containerID="a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.487182 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807"} err="failed to get container status \"a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807\": rpc error: code = NotFound desc = could not find container \"a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807\": container with ID starting with a861a24108ae3d89ae87249d688d3ce19111eff853b3fc43baa0e2897979c807 not found: ID does not exist" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.487196 4931 scope.go:117] "RemoveContainer" containerID="01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c" Dec 01 15:35:29 crc kubenswrapper[4931]: E1201 15:35:29.487435 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c\": container with ID starting with 01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c not found: ID does not exist" containerID="01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.487469 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c"} err="failed to get container status \"01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c\": rpc error: code = NotFound desc = could not find container 
\"01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c\": container with ID starting with 01362cc9981ab3f89d36624436abc2fcaee3bf92ea927106b07187fa29fe6c0c not found: ID does not exist" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.496005 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-utilities\") pod \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.496176 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-catalog-content\") pod \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.496397 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbmv7\" (UniqueName: \"kubernetes.io/projected/a36b884a-b808-4e6c-9db0-20a6d5d56df8-kube-api-access-bbmv7\") pod \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\" (UID: \"a36b884a-b808-4e6c-9db0-20a6d5d56df8\") " Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.496893 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-utilities" (OuterVolumeSpecName: "utilities") pod "a36b884a-b808-4e6c-9db0-20a6d5d56df8" (UID: "a36b884a-b808-4e6c-9db0-20a6d5d56df8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.502599 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36b884a-b808-4e6c-9db0-20a6d5d56df8-kube-api-access-bbmv7" (OuterVolumeSpecName: "kube-api-access-bbmv7") pod "a36b884a-b808-4e6c-9db0-20a6d5d56df8" (UID: "a36b884a-b808-4e6c-9db0-20a6d5d56df8"). InnerVolumeSpecName "kube-api-access-bbmv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.598425 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbmv7\" (UniqueName: \"kubernetes.io/projected/a36b884a-b808-4e6c-9db0-20a6d5d56df8-kube-api-access-bbmv7\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.598453 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.610231 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a36b884a-b808-4e6c-9db0-20a6d5d56df8" (UID: "a36b884a-b808-4e6c-9db0-20a6d5d56df8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.699764 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a36b884a-b808-4e6c-9db0-20a6d5d56df8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.713727 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v6nm2"] Dec 01 15:35:29 crc kubenswrapper[4931]: I1201 15:35:29.725709 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v6nm2"] Dec 01 15:35:30 crc kubenswrapper[4931]: I1201 15:35:30.254831 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" path="/var/lib/kubelet/pods/a36b884a-b808-4e6c-9db0-20a6d5d56df8/volumes" Dec 01 15:35:32 crc kubenswrapper[4931]: I1201 15:35:32.410265 4931 generic.go:334] "Generic (PLEG): container finished" podID="624618ca-ac0b-4fcf-bcf8-a7e744c98241" containerID="b60697e8021dbe946859ae4f51e0ca2b139f8f7518ea325511c579338c477611" exitCode=0 Dec 01 15:35:32 crc kubenswrapper[4931]: I1201 15:35:32.410359 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" event={"ID":"624618ca-ac0b-4fcf-bcf8-a7e744c98241","Type":"ContainerDied","Data":"b60697e8021dbe946859ae4f51e0ca2b139f8f7518ea325511c579338c477611"} Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.816097 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905449 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-ovn-default-certs-0\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905506 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-neutron-metadata-combined-ca-bundle\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905540 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-nova-combined-ca-bundle\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905582 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905616 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-telemetry-combined-ca-bundle\") pod 
\"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905635 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv4kh\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-kube-api-access-pv4kh\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905722 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-bootstrap-combined-ca-bundle\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905744 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905786 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ssh-key\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905847 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: 
\"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905894 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-repo-setup-combined-ca-bundle\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905914 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-libvirt-combined-ca-bundle\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905941 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-inventory\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.905961 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ovn-combined-ca-bundle\") pod \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\" (UID: \"624618ca-ac0b-4fcf-bcf8-a7e744c98241\") " Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.911913 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.912104 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-kube-api-access-pv4kh" (OuterVolumeSpecName: "kube-api-access-pv4kh") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "kube-api-access-pv4kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.912920 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.913518 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.913528 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.913717 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.914039 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.914760 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.925899 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.926492 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.926566 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.926698 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.938575 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:35:33 crc kubenswrapper[4931]: I1201 15:35:33.944628 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-inventory" (OuterVolumeSpecName: "inventory") pod "624618ca-ac0b-4fcf-bcf8-a7e744c98241" (UID: "624618ca-ac0b-4fcf-bcf8-a7e744c98241"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007793 4931 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007834 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007845 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007855 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007868 4931 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc 
kubenswrapper[4931]: I1201 15:35:34.007877 4931 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007885 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007895 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007904 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007915 4931 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007927 4931 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007940 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007957 4931 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624618ca-ac0b-4fcf-bcf8-a7e744c98241-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.007970 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv4kh\" (UniqueName: \"kubernetes.io/projected/624618ca-ac0b-4fcf-bcf8-a7e744c98241-kube-api-access-pv4kh\") on node \"crc\" DevicePath \"\"" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.428985 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" event={"ID":"624618ca-ac0b-4fcf-bcf8-a7e744c98241","Type":"ContainerDied","Data":"899b41b17153c38eec0f844bdbfc87a24ba7ed42ac4d465c06848ce9566df2d8"} Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.429028 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z54c5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.429034 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="899b41b17153c38eec0f844bdbfc87a24ba7ed42ac4d465c06848ce9566df2d8" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.522959 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5"] Dec 01 15:35:34 crc kubenswrapper[4931]: E1201 15:35:34.523326 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerName="extract-utilities" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.523344 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerName="extract-utilities" Dec 01 15:35:34 crc kubenswrapper[4931]: E1201 15:35:34.523371 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerName="extract-content" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.523378 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerName="extract-content" Dec 01 15:35:34 crc kubenswrapper[4931]: E1201 15:35:34.523417 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerName="registry-server" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.523428 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerName="registry-server" Dec 01 15:35:34 crc kubenswrapper[4931]: E1201 15:35:34.523447 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624618ca-ac0b-4fcf-bcf8-a7e744c98241" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.523458 4931 
state_mem.go:107] "Deleted CPUSet assignment" podUID="624618ca-ac0b-4fcf-bcf8-a7e744c98241" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.523645 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36b884a-b808-4e6c-9db0-20a6d5d56df8" containerName="registry-server" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.523670 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="624618ca-ac0b-4fcf-bcf8-a7e744c98241" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.524266 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.528990 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.529160 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.529250 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.530540 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.533704 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.544733 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5"] Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.618182 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.618322 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.618420 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w56z5\" (UniqueName: \"kubernetes.io/projected/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-kube-api-access-w56z5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.618519 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.618610 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.723578 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.723655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.723693 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w56z5\" (UniqueName: \"kubernetes.io/projected/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-kube-api-access-w56z5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.723711 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.723736 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.724605 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.733033 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.743147 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.752152 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: 
I1201 15:35:34.773428 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56z5\" (UniqueName: \"kubernetes.io/projected/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-kube-api-access-w56z5\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-btzw5\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:34 crc kubenswrapper[4931]: I1201 15:35:34.843318 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:35:35 crc kubenswrapper[4931]: I1201 15:35:35.340052 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5"] Dec 01 15:35:35 crc kubenswrapper[4931]: I1201 15:35:35.436544 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" event={"ID":"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7","Type":"ContainerStarted","Data":"b9c0093efeabefbb1a819426632988111744285b6b78584097cb6aaf75e7406b"} Dec 01 15:35:37 crc kubenswrapper[4931]: I1201 15:35:37.455908 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" event={"ID":"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7","Type":"ContainerStarted","Data":"0ae7a0785b0c79fc65c9086030524f31114180fd35e414e7f71fdfec570a538c"} Dec 01 15:35:37 crc kubenswrapper[4931]: I1201 15:35:37.475008 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" podStartSLOduration=2.511134471 podStartE2EDuration="3.474986634s" podCreationTimestamp="2025-12-01 15:35:34 +0000 UTC" firstStartedPulling="2025-12-01 15:35:35.347255788 +0000 UTC m=+2081.773129455" lastFinishedPulling="2025-12-01 15:35:36.311107951 +0000 UTC m=+2082.736981618" observedRunningTime="2025-12-01 15:35:37.474809419 +0000 UTC m=+2083.900683086" 
watchObservedRunningTime="2025-12-01 15:35:37.474986634 +0000 UTC m=+2083.900860301" Dec 01 15:36:19 crc kubenswrapper[4931]: I1201 15:36:19.871643 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:36:19 crc kubenswrapper[4931]: I1201 15:36:19.872251 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.081690 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hq9cq"] Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.084666 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.097741 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hq9cq"] Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.173304 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-utilities\") pod \"community-operators-hq9cq\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.173665 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfj76\" (UniqueName: \"kubernetes.io/projected/40724855-06db-44a3-b66a-c55da123716a-kube-api-access-pfj76\") pod \"community-operators-hq9cq\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.173854 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-catalog-content\") pod \"community-operators-hq9cq\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.276140 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-utilities\") pod \"community-operators-hq9cq\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.276196 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pfj76\" (UniqueName: \"kubernetes.io/projected/40724855-06db-44a3-b66a-c55da123716a-kube-api-access-pfj76\") pod \"community-operators-hq9cq\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.276323 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-catalog-content\") pod \"community-operators-hq9cq\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.276952 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-catalog-content\") pod \"community-operators-hq9cq\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.277080 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-utilities\") pod \"community-operators-hq9cq\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.310321 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfj76\" (UniqueName: \"kubernetes.io/projected/40724855-06db-44a3-b66a-c55da123716a-kube-api-access-pfj76\") pod \"community-operators-hq9cq\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:31 crc kubenswrapper[4931]: I1201 15:36:31.447146 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:32 crc kubenswrapper[4931]: I1201 15:36:32.010817 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hq9cq"] Dec 01 15:36:32 crc kubenswrapper[4931]: I1201 15:36:32.922288 4931 generic.go:334] "Generic (PLEG): container finished" podID="40724855-06db-44a3-b66a-c55da123716a" containerID="a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53" exitCode=0 Dec 01 15:36:32 crc kubenswrapper[4931]: I1201 15:36:32.922365 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hq9cq" event={"ID":"40724855-06db-44a3-b66a-c55da123716a","Type":"ContainerDied","Data":"a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53"} Dec 01 15:36:32 crc kubenswrapper[4931]: I1201 15:36:32.922561 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hq9cq" event={"ID":"40724855-06db-44a3-b66a-c55da123716a","Type":"ContainerStarted","Data":"d27c29fced388e523cc2ef09f28ac2c4aaa56056200aca58dd8ed7ab1c823b6b"} Dec 01 15:36:33 crc kubenswrapper[4931]: I1201 15:36:33.945580 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hq9cq" event={"ID":"40724855-06db-44a3-b66a-c55da123716a","Type":"ContainerStarted","Data":"482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c"} Dec 01 15:36:34 crc kubenswrapper[4931]: I1201 15:36:34.955526 4931 generic.go:334] "Generic (PLEG): container finished" podID="40724855-06db-44a3-b66a-c55da123716a" containerID="482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c" exitCode=0 Dec 01 15:36:34 crc kubenswrapper[4931]: I1201 15:36:34.955579 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hq9cq" 
event={"ID":"40724855-06db-44a3-b66a-c55da123716a","Type":"ContainerDied","Data":"482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c"} Dec 01 15:36:35 crc kubenswrapper[4931]: I1201 15:36:35.965827 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hq9cq" event={"ID":"40724855-06db-44a3-b66a-c55da123716a","Type":"ContainerStarted","Data":"86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197"} Dec 01 15:36:35 crc kubenswrapper[4931]: I1201 15:36:35.986408 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hq9cq" podStartSLOduration=2.568701945 podStartE2EDuration="4.986371736s" podCreationTimestamp="2025-12-01 15:36:31 +0000 UTC" firstStartedPulling="2025-12-01 15:36:32.924777022 +0000 UTC m=+2139.350650689" lastFinishedPulling="2025-12-01 15:36:35.342446803 +0000 UTC m=+2141.768320480" observedRunningTime="2025-12-01 15:36:35.983835815 +0000 UTC m=+2142.409709482" watchObservedRunningTime="2025-12-01 15:36:35.986371736 +0000 UTC m=+2142.412245403" Dec 01 15:36:41 crc kubenswrapper[4931]: I1201 15:36:41.447729 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:41 crc kubenswrapper[4931]: I1201 15:36:41.447998 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:41 crc kubenswrapper[4931]: I1201 15:36:41.496093 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:42 crc kubenswrapper[4931]: I1201 15:36:42.029833 4931 generic.go:334] "Generic (PLEG): container finished" podID="249f41eb-85d7-46f2-80d3-5f1ea0dcbda7" containerID="0ae7a0785b0c79fc65c9086030524f31114180fd35e414e7f71fdfec570a538c" exitCode=0 Dec 01 15:36:42 crc kubenswrapper[4931]: I1201 15:36:42.029908 
4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" event={"ID":"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7","Type":"ContainerDied","Data":"0ae7a0785b0c79fc65c9086030524f31114180fd35e414e7f71fdfec570a538c"} Dec 01 15:36:42 crc kubenswrapper[4931]: I1201 15:36:42.076511 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:42 crc kubenswrapper[4931]: I1201 15:36:42.126100 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hq9cq"] Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.484534 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.609116 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-inventory\") pod \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.609160 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ssh-key\") pod \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.609257 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovncontroller-config-0\") pod \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.609303 4931 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w56z5\" (UniqueName: \"kubernetes.io/projected/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-kube-api-access-w56z5\") pod \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.609374 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovn-combined-ca-bundle\") pod \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\" (UID: \"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7\") " Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.615002 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-kube-api-access-w56z5" (OuterVolumeSpecName: "kube-api-access-w56z5") pod "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7" (UID: "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7"). InnerVolumeSpecName "kube-api-access-w56z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.615549 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7" (UID: "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.634602 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7" (UID: "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.637071 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-inventory" (OuterVolumeSpecName: "inventory") pod "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7" (UID: "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.652339 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7" (UID: "249f41eb-85d7-46f2-80d3-5f1ea0dcbda7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.711557 4931 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.711589 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.711600 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:36:43 crc kubenswrapper[4931]: I1201 15:36:43.711609 4931 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:36:43 crc 
kubenswrapper[4931]: I1201 15:36:43.711619 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w56z5\" (UniqueName: \"kubernetes.io/projected/249f41eb-85d7-46f2-80d3-5f1ea0dcbda7-kube-api-access-w56z5\") on node \"crc\" DevicePath \"\"" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.051761 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" event={"ID":"249f41eb-85d7-46f2-80d3-5f1ea0dcbda7","Type":"ContainerDied","Data":"b9c0093efeabefbb1a819426632988111744285b6b78584097cb6aaf75e7406b"} Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.051823 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9c0093efeabefbb1a819426632988111744285b6b78584097cb6aaf75e7406b" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.051772 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-btzw5" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.052497 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hq9cq" podUID="40724855-06db-44a3-b66a-c55da123716a" containerName="registry-server" containerID="cri-o://86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197" gracePeriod=2 Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.158525 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv"] Dec 01 15:36:44 crc kubenswrapper[4931]: E1201 15:36:44.159035 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249f41eb-85d7-46f2-80d3-5f1ea0dcbda7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.159054 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="249f41eb-85d7-46f2-80d3-5f1ea0dcbda7" 
containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.159294 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="249f41eb-85d7-46f2-80d3-5f1ea0dcbda7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.160288 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.163840 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.168510 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.168670 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.168755 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.168830 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.169009 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.176955 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv"] Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.329941 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.330342 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.330586 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.330658 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjrp8\" (UniqueName: \"kubernetes.io/projected/6225489b-6b7e-40c4-9f3e-e2e28b74d274-kube-api-access-zjrp8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.330706 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.330757 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.432260 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.432304 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.432659 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-ssh-key\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.432688 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjrp8\" (UniqueName: \"kubernetes.io/projected/6225489b-6b7e-40c4-9f3e-e2e28b74d274-kube-api-access-zjrp8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.432709 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.432729 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.439405 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.440247 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.441189 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.442039 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.449817 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 
15:36:44.459431 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjrp8\" (UniqueName: \"kubernetes.io/projected/6225489b-6b7e-40c4-9f3e-e2e28b74d274-kube-api-access-zjrp8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.540734 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.633782 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.736918 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-catalog-content\") pod \"40724855-06db-44a3-b66a-c55da123716a\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.737010 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfj76\" (UniqueName: \"kubernetes.io/projected/40724855-06db-44a3-b66a-c55da123716a-kube-api-access-pfj76\") pod \"40724855-06db-44a3-b66a-c55da123716a\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.737052 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-utilities\") pod \"40724855-06db-44a3-b66a-c55da123716a\" (UID: \"40724855-06db-44a3-b66a-c55da123716a\") " Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.739148 4931 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-utilities" (OuterVolumeSpecName: "utilities") pod "40724855-06db-44a3-b66a-c55da123716a" (UID: "40724855-06db-44a3-b66a-c55da123716a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.742748 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40724855-06db-44a3-b66a-c55da123716a-kube-api-access-pfj76" (OuterVolumeSpecName: "kube-api-access-pfj76") pod "40724855-06db-44a3-b66a-c55da123716a" (UID: "40724855-06db-44a3-b66a-c55da123716a"). InnerVolumeSpecName "kube-api-access-pfj76". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.793216 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40724855-06db-44a3-b66a-c55da123716a" (UID: "40724855-06db-44a3-b66a-c55da123716a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.843944 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.844005 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfj76\" (UniqueName: \"kubernetes.io/projected/40724855-06db-44a3-b66a-c55da123716a-kube-api-access-pfj76\") on node \"crc\" DevicePath \"\"" Dec 01 15:36:44 crc kubenswrapper[4931]: I1201 15:36:44.844020 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40724855-06db-44a3-b66a-c55da123716a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.041236 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv"] Dec 01 15:36:45 crc kubenswrapper[4931]: W1201 15:36:45.044712 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6225489b_6b7e_40c4_9f3e_e2e28b74d274.slice/crio-80729d1e67c33076dd42c2a1708ed639797ef394b50f168bc0b0fe536a88af95 WatchSource:0}: Error finding container 80729d1e67c33076dd42c2a1708ed639797ef394b50f168bc0b0fe536a88af95: Status 404 returned error can't find the container with id 80729d1e67c33076dd42c2a1708ed639797ef394b50f168bc0b0fe536a88af95 Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.060547 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" event={"ID":"6225489b-6b7e-40c4-9f3e-e2e28b74d274","Type":"ContainerStarted","Data":"80729d1e67c33076dd42c2a1708ed639797ef394b50f168bc0b0fe536a88af95"} Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 
15:36:45.063192 4931 generic.go:334] "Generic (PLEG): container finished" podID="40724855-06db-44a3-b66a-c55da123716a" containerID="86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197" exitCode=0 Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.063234 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hq9cq" event={"ID":"40724855-06db-44a3-b66a-c55da123716a","Type":"ContainerDied","Data":"86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197"} Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.063262 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hq9cq" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.063287 4931 scope.go:117] "RemoveContainer" containerID="86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.063263 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hq9cq" event={"ID":"40724855-06db-44a3-b66a-c55da123716a","Type":"ContainerDied","Data":"d27c29fced388e523cc2ef09f28ac2c4aaa56056200aca58dd8ed7ab1c823b6b"} Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.086008 4931 scope.go:117] "RemoveContainer" containerID="482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.095762 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hq9cq"] Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.104108 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hq9cq"] Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.113475 4931 scope.go:117] "RemoveContainer" containerID="a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.131847 
4931 scope.go:117] "RemoveContainer" containerID="86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197" Dec 01 15:36:45 crc kubenswrapper[4931]: E1201 15:36:45.132811 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197\": container with ID starting with 86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197 not found: ID does not exist" containerID="86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.132870 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197"} err="failed to get container status \"86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197\": rpc error: code = NotFound desc = could not find container \"86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197\": container with ID starting with 86766f44697e9ca40476bca982b2df7b9fa56c8e2ea93eff9044cb0354b29197 not found: ID does not exist" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.132904 4931 scope.go:117] "RemoveContainer" containerID="482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c" Dec 01 15:36:45 crc kubenswrapper[4931]: E1201 15:36:45.133345 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c\": container with ID starting with 482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c not found: ID does not exist" containerID="482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.133375 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c"} err="failed to get container status \"482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c\": rpc error: code = NotFound desc = could not find container \"482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c\": container with ID starting with 482ad36e3658eaadd5b99ecfbae2f97ff0a2ba3a40aa63127da9631e7dd99a7c not found: ID does not exist" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.133454 4931 scope.go:117] "RemoveContainer" containerID="a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53" Dec 01 15:36:45 crc kubenswrapper[4931]: E1201 15:36:45.133758 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53\": container with ID starting with a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53 not found: ID does not exist" containerID="a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53" Dec 01 15:36:45 crc kubenswrapper[4931]: I1201 15:36:45.133781 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53"} err="failed to get container status \"a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53\": rpc error: code = NotFound desc = could not find container \"a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53\": container with ID starting with a0ce1ae9b64479c5959c056a62f60a56df6d44884670709aedc1b194bfddcf53 not found: ID does not exist" Dec 01 15:36:46 crc kubenswrapper[4931]: I1201 15:36:46.072874 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" 
event={"ID":"6225489b-6b7e-40c4-9f3e-e2e28b74d274","Type":"ContainerStarted","Data":"a016db7efcf2b2fd80c7e0f5b00f3b192c6d5e31b88de19b1764540376c2e68d"} Dec 01 15:36:46 crc kubenswrapper[4931]: I1201 15:36:46.095251 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" podStartSLOduration=1.334831367 podStartE2EDuration="2.095229324s" podCreationTimestamp="2025-12-01 15:36:44 +0000 UTC" firstStartedPulling="2025-12-01 15:36:45.047514715 +0000 UTC m=+2151.473388382" lastFinishedPulling="2025-12-01 15:36:45.807912672 +0000 UTC m=+2152.233786339" observedRunningTime="2025-12-01 15:36:46.087198 +0000 UTC m=+2152.513071697" watchObservedRunningTime="2025-12-01 15:36:46.095229324 +0000 UTC m=+2152.521103001" Dec 01 15:36:46 crc kubenswrapper[4931]: I1201 15:36:46.253426 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40724855-06db-44a3-b66a-c55da123716a" path="/var/lib/kubelet/pods/40724855-06db-44a3-b66a-c55da123716a/volumes" Dec 01 15:36:49 crc kubenswrapper[4931]: I1201 15:36:49.871613 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:36:49 crc kubenswrapper[4931]: I1201 15:36:49.873059 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:37:19 crc kubenswrapper[4931]: I1201 15:37:19.872561 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:37:19 crc kubenswrapper[4931]: I1201 15:37:19.873071 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:37:19 crc kubenswrapper[4931]: I1201 15:37:19.873109 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:37:19 crc kubenswrapper[4931]: I1201 15:37:19.873815 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98d69a0489f57745ef5af73760a1c1eabe377307f549877e60e78dd8e543b8a4"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:37:19 crc kubenswrapper[4931]: I1201 15:37:19.873869 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://98d69a0489f57745ef5af73760a1c1eabe377307f549877e60e78dd8e543b8a4" gracePeriod=600 Dec 01 15:37:20 crc kubenswrapper[4931]: I1201 15:37:20.385962 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="98d69a0489f57745ef5af73760a1c1eabe377307f549877e60e78dd8e543b8a4" exitCode=0 Dec 01 15:37:20 crc kubenswrapper[4931]: I1201 15:37:20.386049 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"98d69a0489f57745ef5af73760a1c1eabe377307f549877e60e78dd8e543b8a4"} Dec 01 15:37:20 crc kubenswrapper[4931]: I1201 15:37:20.386514 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"} Dec 01 15:37:20 crc kubenswrapper[4931]: I1201 15:37:20.386550 4931 scope.go:117] "RemoveContainer" containerID="7f925e29ca5b94bf198c139ba1127bf2d13bec5b2c5bd9b5f7bc7437bdd25cb9" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.325481 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kc4cx"] Dec 01 15:37:31 crc kubenswrapper[4931]: E1201 15:37:31.326526 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40724855-06db-44a3-b66a-c55da123716a" containerName="extract-utilities" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.326546 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="40724855-06db-44a3-b66a-c55da123716a" containerName="extract-utilities" Dec 01 15:37:31 crc kubenswrapper[4931]: E1201 15:37:31.326580 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40724855-06db-44a3-b66a-c55da123716a" containerName="extract-content" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.326587 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="40724855-06db-44a3-b66a-c55da123716a" containerName="extract-content" Dec 01 15:37:31 crc kubenswrapper[4931]: E1201 15:37:31.326604 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40724855-06db-44a3-b66a-c55da123716a" containerName="registry-server" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.326611 4931 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="40724855-06db-44a3-b66a-c55da123716a" containerName="registry-server" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.326841 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="40724855-06db-44a3-b66a-c55da123716a" containerName="registry-server" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.328524 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.343041 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kc4cx"] Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.438726 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-catalog-content\") pod \"redhat-marketplace-kc4cx\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.438802 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-utilities\") pod \"redhat-marketplace-kc4cx\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.438859 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kxjs\" (UniqueName: \"kubernetes.io/projected/14ab6d6c-a306-4818-b409-2d8b6f2cc673-kube-api-access-5kxjs\") pod \"redhat-marketplace-kc4cx\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.540123 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-catalog-content\") pod \"redhat-marketplace-kc4cx\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.540186 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-utilities\") pod \"redhat-marketplace-kc4cx\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.540236 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kxjs\" (UniqueName: \"kubernetes.io/projected/14ab6d6c-a306-4818-b409-2d8b6f2cc673-kube-api-access-5kxjs\") pod \"redhat-marketplace-kc4cx\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.540764 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-catalog-content\") pod \"redhat-marketplace-kc4cx\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.540958 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-utilities\") pod \"redhat-marketplace-kc4cx\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.563616 4931 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5kxjs\" (UniqueName: \"kubernetes.io/projected/14ab6d6c-a306-4818-b409-2d8b6f2cc673-kube-api-access-5kxjs\") pod \"redhat-marketplace-kc4cx\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:31 crc kubenswrapper[4931]: I1201 15:37:31.670792 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:32 crc kubenswrapper[4931]: I1201 15:37:32.120567 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kc4cx"] Dec 01 15:37:32 crc kubenswrapper[4931]: I1201 15:37:32.509662 4931 generic.go:334] "Generic (PLEG): container finished" podID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerID="42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1" exitCode=0 Dec 01 15:37:32 crc kubenswrapper[4931]: I1201 15:37:32.509746 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kc4cx" event={"ID":"14ab6d6c-a306-4818-b409-2d8b6f2cc673","Type":"ContainerDied","Data":"42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1"} Dec 01 15:37:32 crc kubenswrapper[4931]: I1201 15:37:32.510040 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kc4cx" event={"ID":"14ab6d6c-a306-4818-b409-2d8b6f2cc673","Type":"ContainerStarted","Data":"5a4c0354aefa7acd794ecd51984fe1f18c491e4dc9e5ea0144ba15d8e1792ecb"} Dec 01 15:37:33 crc kubenswrapper[4931]: I1201 15:37:33.521576 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kc4cx" event={"ID":"14ab6d6c-a306-4818-b409-2d8b6f2cc673","Type":"ContainerStarted","Data":"11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3"} Dec 01 15:37:34 crc kubenswrapper[4931]: I1201 15:37:34.536167 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="6225489b-6b7e-40c4-9f3e-e2e28b74d274" containerID="a016db7efcf2b2fd80c7e0f5b00f3b192c6d5e31b88de19b1764540376c2e68d" exitCode=0 Dec 01 15:37:34 crc kubenswrapper[4931]: I1201 15:37:34.536319 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" event={"ID":"6225489b-6b7e-40c4-9f3e-e2e28b74d274","Type":"ContainerDied","Data":"a016db7efcf2b2fd80c7e0f5b00f3b192c6d5e31b88de19b1764540376c2e68d"} Dec 01 15:37:34 crc kubenswrapper[4931]: I1201 15:37:34.542279 4931 generic.go:334] "Generic (PLEG): container finished" podID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerID="11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3" exitCode=0 Dec 01 15:37:34 crc kubenswrapper[4931]: I1201 15:37:34.542373 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kc4cx" event={"ID":"14ab6d6c-a306-4818-b409-2d8b6f2cc673","Type":"ContainerDied","Data":"11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3"} Dec 01 15:37:35 crc kubenswrapper[4931]: I1201 15:37:35.558247 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kc4cx" event={"ID":"14ab6d6c-a306-4818-b409-2d8b6f2cc673","Type":"ContainerStarted","Data":"bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad"} Dec 01 15:37:35 crc kubenswrapper[4931]: I1201 15:37:35.593575 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kc4cx" podStartSLOduration=2.094994652 podStartE2EDuration="4.593549016s" podCreationTimestamp="2025-12-01 15:37:31 +0000 UTC" firstStartedPulling="2025-12-01 15:37:32.511808129 +0000 UTC m=+2198.937681786" lastFinishedPulling="2025-12-01 15:37:35.010362483 +0000 UTC m=+2201.436236150" observedRunningTime="2025-12-01 15:37:35.580943335 +0000 UTC m=+2202.006817012" watchObservedRunningTime="2025-12-01 15:37:35.593549016 +0000 UTC 
m=+2202.019422683" Dec 01 15:37:35 crc kubenswrapper[4931]: I1201 15:37:35.963191 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.027011 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-metadata-combined-ca-bundle\") pod \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.027111 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjrp8\" (UniqueName: \"kubernetes.io/projected/6225489b-6b7e-40c4-9f3e-e2e28b74d274-kube-api-access-zjrp8\") pod \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.027157 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-inventory\") pod \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.027207 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-ovn-metadata-agent-neutron-config-0\") pod \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.027251 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-ssh-key\") pod 
\"6225489b-6b7e-40c4-9f3e-e2e28b74d274\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.027275 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-nova-metadata-neutron-config-0\") pod \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\" (UID: \"6225489b-6b7e-40c4-9f3e-e2e28b74d274\") " Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.033645 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6225489b-6b7e-40c4-9f3e-e2e28b74d274" (UID: "6225489b-6b7e-40c4-9f3e-e2e28b74d274"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.035356 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6225489b-6b7e-40c4-9f3e-e2e28b74d274-kube-api-access-zjrp8" (OuterVolumeSpecName: "kube-api-access-zjrp8") pod "6225489b-6b7e-40c4-9f3e-e2e28b74d274" (UID: "6225489b-6b7e-40c4-9f3e-e2e28b74d274"). InnerVolumeSpecName "kube-api-access-zjrp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.059258 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "6225489b-6b7e-40c4-9f3e-e2e28b74d274" (UID: "6225489b-6b7e-40c4-9f3e-e2e28b74d274"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.062148 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6225489b-6b7e-40c4-9f3e-e2e28b74d274" (UID: "6225489b-6b7e-40c4-9f3e-e2e28b74d274"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.063888 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "6225489b-6b7e-40c4-9f3e-e2e28b74d274" (UID: "6225489b-6b7e-40c4-9f3e-e2e28b74d274"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.080150 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-inventory" (OuterVolumeSpecName: "inventory") pod "6225489b-6b7e-40c4-9f3e-e2e28b74d274" (UID: "6225489b-6b7e-40c4-9f3e-e2e28b74d274"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.130015 4931 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.130101 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjrp8\" (UniqueName: \"kubernetes.io/projected/6225489b-6b7e-40c4-9f3e-e2e28b74d274-kube-api-access-zjrp8\") on node \"crc\" DevicePath \"\"" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.130118 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.130151 4931 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.130168 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.130182 4931 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6225489b-6b7e-40c4-9f3e-e2e28b74d274-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.568395 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.568371 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv" event={"ID":"6225489b-6b7e-40c4-9f3e-e2e28b74d274","Type":"ContainerDied","Data":"80729d1e67c33076dd42c2a1708ed639797ef394b50f168bc0b0fe536a88af95"} Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.568445 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80729d1e67c33076dd42c2a1708ed639797ef394b50f168bc0b0fe536a88af95" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.661125 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr"] Dec 01 15:37:36 crc kubenswrapper[4931]: E1201 15:37:36.661539 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6225489b-6b7e-40c4-9f3e-e2e28b74d274" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.661555 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6225489b-6b7e-40c4-9f3e-e2e28b74d274" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.661769 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6225489b-6b7e-40c4-9f3e-e2e28b74d274" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.662504 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.665924 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.666137 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.666604 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.666805 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.666914 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.678299 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr"] Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.740628 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.740672 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.740695 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.740944 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.741020 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t6x2\" (UniqueName: \"kubernetes.io/projected/c016013e-7315-4763-9b4d-0876e4c2068f-kube-api-access-4t6x2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.842259 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.842303 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.842327 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.842407 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.842436 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t6x2\" (UniqueName: \"kubernetes.io/projected/c016013e-7315-4763-9b4d-0876e4c2068f-kube-api-access-4t6x2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.846946 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.847068 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.847131 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.853284 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.859328 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t6x2\" (UniqueName: \"kubernetes.io/projected/c016013e-7315-4763-9b4d-0876e4c2068f-kube-api-access-4t6x2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vrghr\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:36 crc kubenswrapper[4931]: I1201 15:37:36.994831 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" Dec 01 15:37:37 crc kubenswrapper[4931]: I1201 15:37:37.492731 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr"] Dec 01 15:37:37 crc kubenswrapper[4931]: W1201 15:37:37.497887 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc016013e_7315_4763_9b4d_0876e4c2068f.slice/crio-9f335065c261174227ad2daad2d37e4b63f666a30afa141ab138aefdfeaa869e WatchSource:0}: Error finding container 9f335065c261174227ad2daad2d37e4b63f666a30afa141ab138aefdfeaa869e: Status 404 returned error can't find the container with id 9f335065c261174227ad2daad2d37e4b63f666a30afa141ab138aefdfeaa869e Dec 01 15:37:37 crc kubenswrapper[4931]: I1201 15:37:37.578184 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" event={"ID":"c016013e-7315-4763-9b4d-0876e4c2068f","Type":"ContainerStarted","Data":"9f335065c261174227ad2daad2d37e4b63f666a30afa141ab138aefdfeaa869e"} Dec 01 15:37:38 crc kubenswrapper[4931]: I1201 15:37:38.590852 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" event={"ID":"c016013e-7315-4763-9b4d-0876e4c2068f","Type":"ContainerStarted","Data":"06b6dfbd2a60ee68d4c7929e40bacd9508c9bad1de87ac5032f7d05e8a43c149"} Dec 01 15:37:38 crc kubenswrapper[4931]: I1201 15:37:38.613204 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" podStartSLOduration=2.097969251 podStartE2EDuration="2.61317969s" podCreationTimestamp="2025-12-01 15:37:36 +0000 UTC" firstStartedPulling="2025-12-01 15:37:37.500859313 +0000 UTC m=+2203.926732980" lastFinishedPulling="2025-12-01 15:37:38.016069752 +0000 UTC m=+2204.441943419" 
observedRunningTime="2025-12-01 15:37:38.608414758 +0000 UTC m=+2205.034288445" watchObservedRunningTime="2025-12-01 15:37:38.61317969 +0000 UTC m=+2205.039053357" Dec 01 15:37:41 crc kubenswrapper[4931]: I1201 15:37:41.671302 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:41 crc kubenswrapper[4931]: I1201 15:37:41.672213 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:41 crc kubenswrapper[4931]: I1201 15:37:41.732782 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:42 crc kubenswrapper[4931]: I1201 15:37:42.687914 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:42 crc kubenswrapper[4931]: I1201 15:37:42.735951 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kc4cx"] Dec 01 15:37:44 crc kubenswrapper[4931]: I1201 15:37:44.662778 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kc4cx" podUID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerName="registry-server" containerID="cri-o://bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad" gracePeriod=2 Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.629498 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.676157 4931 generic.go:334] "Generic (PLEG): container finished" podID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerID="bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad" exitCode=0 Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.676207 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kc4cx" event={"ID":"14ab6d6c-a306-4818-b409-2d8b6f2cc673","Type":"ContainerDied","Data":"bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad"} Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.676242 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kc4cx" event={"ID":"14ab6d6c-a306-4818-b409-2d8b6f2cc673","Type":"ContainerDied","Data":"5a4c0354aefa7acd794ecd51984fe1f18c491e4dc9e5ea0144ba15d8e1792ecb"} Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.676269 4931 scope.go:117] "RemoveContainer" containerID="bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.676302 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kc4cx" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.699880 4931 scope.go:117] "RemoveContainer" containerID="11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.720424 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-catalog-content\") pod \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.720578 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-utilities\") pod \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.720606 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kxjs\" (UniqueName: \"kubernetes.io/projected/14ab6d6c-a306-4818-b409-2d8b6f2cc673-kube-api-access-5kxjs\") pod \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\" (UID: \"14ab6d6c-a306-4818-b409-2d8b6f2cc673\") " Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.721406 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-utilities" (OuterVolumeSpecName: "utilities") pod "14ab6d6c-a306-4818-b409-2d8b6f2cc673" (UID: "14ab6d6c-a306-4818-b409-2d8b6f2cc673"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.722796 4931 scope.go:117] "RemoveContainer" containerID="42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.727135 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ab6d6c-a306-4818-b409-2d8b6f2cc673-kube-api-access-5kxjs" (OuterVolumeSpecName: "kube-api-access-5kxjs") pod "14ab6d6c-a306-4818-b409-2d8b6f2cc673" (UID: "14ab6d6c-a306-4818-b409-2d8b6f2cc673"). InnerVolumeSpecName "kube-api-access-5kxjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.739932 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14ab6d6c-a306-4818-b409-2d8b6f2cc673" (UID: "14ab6d6c-a306-4818-b409-2d8b6f2cc673"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.809270 4931 scope.go:117] "RemoveContainer" containerID="bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad" Dec 01 15:37:45 crc kubenswrapper[4931]: E1201 15:37:45.809855 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad\": container with ID starting with bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad not found: ID does not exist" containerID="bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.809890 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad"} err="failed to get container status \"bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad\": rpc error: code = NotFound desc = could not find container \"bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad\": container with ID starting with bc7bce405e0f6b174cbae571ce61a3e35b03b4ec208e5a1ddc09834246cf23ad not found: ID does not exist" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.809913 4931 scope.go:117] "RemoveContainer" containerID="11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3" Dec 01 15:37:45 crc kubenswrapper[4931]: E1201 15:37:45.810400 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3\": container with ID starting with 11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3 not found: ID does not exist" containerID="11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.810430 
4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3"} err="failed to get container status \"11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3\": rpc error: code = NotFound desc = could not find container \"11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3\": container with ID starting with 11eb69bdda189333cb1396d13f4f06d804ef91380936975a7358c8c568428da3 not found: ID does not exist" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.810451 4931 scope.go:117] "RemoveContainer" containerID="42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1" Dec 01 15:37:45 crc kubenswrapper[4931]: E1201 15:37:45.810902 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1\": container with ID starting with 42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1 not found: ID does not exist" containerID="42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.810995 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1"} err="failed to get container status \"42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1\": rpc error: code = NotFound desc = could not find container \"42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1\": container with ID starting with 42a08a35142ae33046e68027b06ee5cccccc63f8dd9b8c08c3dde58b6e9f2ea1 not found: ID does not exist" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.823363 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-utilities\") on node 
\"crc\" DevicePath \"\"" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.823425 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kxjs\" (UniqueName: \"kubernetes.io/projected/14ab6d6c-a306-4818-b409-2d8b6f2cc673-kube-api-access-5kxjs\") on node \"crc\" DevicePath \"\"" Dec 01 15:37:45 crc kubenswrapper[4931]: I1201 15:37:45.823436 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14ab6d6c-a306-4818-b409-2d8b6f2cc673-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:37:46 crc kubenswrapper[4931]: I1201 15:37:46.039864 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kc4cx"] Dec 01 15:37:46 crc kubenswrapper[4931]: I1201 15:37:46.044872 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kc4cx"] Dec 01 15:37:46 crc kubenswrapper[4931]: I1201 15:37:46.253851 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" path="/var/lib/kubelet/pods/14ab6d6c-a306-4818-b409-2d8b6f2cc673/volumes" Dec 01 15:39:49 crc kubenswrapper[4931]: I1201 15:39:49.872445 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:39:49 crc kubenswrapper[4931]: I1201 15:39:49.872996 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:40:19 crc kubenswrapper[4931]: I1201 15:40:19.871841 4931 
patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:40:19 crc kubenswrapper[4931]: I1201 15:40:19.872493 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:40:49 crc kubenswrapper[4931]: I1201 15:40:49.871947 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:40:49 crc kubenswrapper[4931]: I1201 15:40:49.872766 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:40:49 crc kubenswrapper[4931]: I1201 15:40:49.872850 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:40:49 crc kubenswrapper[4931]: I1201 15:40:49.874350 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:40:49 crc kubenswrapper[4931]: I1201 15:40:49.874526 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700" gracePeriod=600 Dec 01 15:40:50 crc kubenswrapper[4931]: E1201 15:40:50.005095 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:40:50 crc kubenswrapper[4931]: I1201 15:40:50.661208 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700" exitCode=0 Dec 01 15:40:50 crc kubenswrapper[4931]: I1201 15:40:50.661330 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"} Dec 01 15:40:50 crc kubenswrapper[4931]: I1201 15:40:50.661598 4931 scope.go:117] "RemoveContainer" containerID="98d69a0489f57745ef5af73760a1c1eabe377307f549877e60e78dd8e543b8a4" Dec 01 15:40:50 crc kubenswrapper[4931]: I1201 15:40:50.662346 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700" Dec 01 15:40:50 crc kubenswrapper[4931]: E1201 15:40:50.662643 4931 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:41:06 crc kubenswrapper[4931]: I1201 15:41:06.241343 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:41:06 crc kubenswrapper[4931]: E1201 15:41:06.242098 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:41:17 crc kubenswrapper[4931]: I1201 15:41:17.242271 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:41:17 crc kubenswrapper[4931]: E1201 15:41:17.243137 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:41:32 crc kubenswrapper[4931]: I1201 15:41:32.243088 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:41:32 crc kubenswrapper[4931]: E1201 15:41:32.244196 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:41:43 crc kubenswrapper[4931]: I1201 15:41:43.242429 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:41:43 crc kubenswrapper[4931]: E1201 15:41:43.243562 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:41:54 crc kubenswrapper[4931]: I1201 15:41:54.246916 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:41:54 crc kubenswrapper[4931]: E1201 15:41:54.247687 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:42:06 crc kubenswrapper[4931]: I1201 15:42:06.242283 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:42:06 crc kubenswrapper[4931]: E1201 15:42:06.242990 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:42:17 crc kubenswrapper[4931]: I1201 15:42:17.445553 4931 generic.go:334] "Generic (PLEG): container finished" podID="c016013e-7315-4763-9b4d-0876e4c2068f" containerID="06b6dfbd2a60ee68d4c7929e40bacd9508c9bad1de87ac5032f7d05e8a43c149" exitCode=0
Dec 01 15:42:17 crc kubenswrapper[4931]: I1201 15:42:17.445650 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" event={"ID":"c016013e-7315-4763-9b4d-0876e4c2068f","Type":"ContainerDied","Data":"06b6dfbd2a60ee68d4c7929e40bacd9508c9bad1de87ac5032f7d05e8a43c149"}
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.846253 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr"
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.959498 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-secret-0\") pod \"c016013e-7315-4763-9b4d-0876e4c2068f\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") "
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.959607 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-combined-ca-bundle\") pod \"c016013e-7315-4763-9b4d-0876e4c2068f\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") "
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.959664 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-ssh-key\") pod \"c016013e-7315-4763-9b4d-0876e4c2068f\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") "
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.959693 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t6x2\" (UniqueName: \"kubernetes.io/projected/c016013e-7315-4763-9b4d-0876e4c2068f-kube-api-access-4t6x2\") pod \"c016013e-7315-4763-9b4d-0876e4c2068f\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") "
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.959810 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-inventory\") pod \"c016013e-7315-4763-9b4d-0876e4c2068f\" (UID: \"c016013e-7315-4763-9b4d-0876e4c2068f\") "
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.966587 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c016013e-7315-4763-9b4d-0876e4c2068f-kube-api-access-4t6x2" (OuterVolumeSpecName: "kube-api-access-4t6x2") pod "c016013e-7315-4763-9b4d-0876e4c2068f" (UID: "c016013e-7315-4763-9b4d-0876e4c2068f"). InnerVolumeSpecName "kube-api-access-4t6x2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.966604 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c016013e-7315-4763-9b4d-0876e4c2068f" (UID: "c016013e-7315-4763-9b4d-0876e4c2068f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.991225 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c016013e-7315-4763-9b4d-0876e4c2068f" (UID: "c016013e-7315-4763-9b4d-0876e4c2068f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 15:42:18 crc kubenswrapper[4931]: I1201 15:42:18.991622 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-inventory" (OuterVolumeSpecName: "inventory") pod "c016013e-7315-4763-9b4d-0876e4c2068f" (UID: "c016013e-7315-4763-9b4d-0876e4c2068f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.000293 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c016013e-7315-4763-9b4d-0876e4c2068f" (UID: "c016013e-7315-4763-9b4d-0876e4c2068f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.061817 4931 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.061852 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.061862 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t6x2\" (UniqueName: \"kubernetes.io/projected/c016013e-7315-4763-9b4d-0876e4c2068f-kube-api-access-4t6x2\") on node \"crc\" DevicePath \"\""
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.061873 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.061884 4931 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c016013e-7315-4763-9b4d-0876e4c2068f-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.465800 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr" event={"ID":"c016013e-7315-4763-9b4d-0876e4c2068f","Type":"ContainerDied","Data":"9f335065c261174227ad2daad2d37e4b63f666a30afa141ab138aefdfeaa869e"}
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.465846 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f335065c261174227ad2daad2d37e4b63f666a30afa141ab138aefdfeaa869e"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.465866 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vrghr"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.557610 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"]
Dec 01 15:42:19 crc kubenswrapper[4931]: E1201 15:42:19.558060 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerName="extract-utilities"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.558083 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerName="extract-utilities"
Dec 01 15:42:19 crc kubenswrapper[4931]: E1201 15:42:19.558106 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerName="registry-server"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.558114 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerName="registry-server"
Dec 01 15:42:19 crc kubenswrapper[4931]: E1201 15:42:19.558164 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c016013e-7315-4763-9b4d-0876e4c2068f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.558176 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c016013e-7315-4763-9b4d-0876e4c2068f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 01 15:42:19 crc kubenswrapper[4931]: E1201 15:42:19.558204 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerName="extract-content"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.558213 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerName="extract-content"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.558450 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c016013e-7315-4763-9b4d-0876e4c2068f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.558490 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ab6d6c-a306-4818-b409-2d8b6f2cc673" containerName="registry-server"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.559226 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.562890 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.563227 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.563524 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.563684 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.563963 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.564180 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.564344 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.570059 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"]
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.673572 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.673649 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.673781 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.673906 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.673986 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5zfk\" (UniqueName: \"kubernetes.io/projected/90baac5e-c041-4bd1-bba8-b11e708370e7-kube-api-access-v5zfk\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.674073 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.674136 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.674226 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.674242 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.775543 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.775592 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.775696 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.775745 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.775776 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.775807 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.775847 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5zfk\" (UniqueName: \"kubernetes.io/projected/90baac5e-c041-4bd1-bba8-b11e708370e7-kube-api-access-v5zfk\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.775894 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.775931 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.777671 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.780600 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.783744 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.785464 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.785916 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.793520 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.796280 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.798288 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.800332 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5zfk\" (UniqueName: \"kubernetes.io/projected/90baac5e-c041-4bd1-bba8-b11e708370e7-kube-api-access-v5zfk\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qbgns\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:19 crc kubenswrapper[4931]: I1201 15:42:19.878744 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"
Dec 01 15:42:20 crc kubenswrapper[4931]: I1201 15:42:20.242913 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:42:20 crc kubenswrapper[4931]: E1201 15:42:20.243335 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:42:20 crc kubenswrapper[4931]: I1201 15:42:20.404763 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns"]
Dec 01 15:42:20 crc kubenswrapper[4931]: I1201 15:42:20.410300 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 15:42:20 crc kubenswrapper[4931]: I1201 15:42:20.475183 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns" event={"ID":"90baac5e-c041-4bd1-bba8-b11e708370e7","Type":"ContainerStarted","Data":"16da4afa9d6a1fe3b8b44d788b01e0c2907961d42c7bd8c9aea0ec3fcaebd17e"}
Dec 01 15:42:21 crc kubenswrapper[4931]: I1201 15:42:21.483209 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns" event={"ID":"90baac5e-c041-4bd1-bba8-b11e708370e7","Type":"ContainerStarted","Data":"885f824143c11c326c0ee9918526d9ef1c72a0a02161cb29cb0b987c3c2a207e"}
Dec 01 15:42:21 crc kubenswrapper[4931]: I1201 15:42:21.502496 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns" podStartSLOduration=1.945567589 podStartE2EDuration="2.502476368s" podCreationTimestamp="2025-12-01 15:42:19 +0000 UTC" firstStartedPulling="2025-12-01 15:42:20.410057284 +0000 UTC m=+2486.835930951" lastFinishedPulling="2025-12-01 15:42:20.966966063 +0000 UTC m=+2487.392839730" observedRunningTime="2025-12-01 15:42:21.50182981 +0000 UTC m=+2487.927703487" watchObservedRunningTime="2025-12-01 15:42:21.502476368 +0000 UTC m=+2487.928350035"
Dec 01 15:42:34 crc kubenswrapper[4931]: I1201 15:42:34.254519 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:42:34 crc kubenswrapper[4931]: E1201 15:42:34.255554 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:42:49 crc kubenswrapper[4931]: I1201 15:42:49.241556 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:42:49 crc kubenswrapper[4931]: E1201 15:42:49.242381 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:43:04 crc kubenswrapper[4931]: I1201 15:43:04.250980 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:43:04 crc kubenswrapper[4931]: E1201 15:43:04.251838 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:43:19 crc kubenswrapper[4931]: I1201 15:43:19.242355 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:43:19 crc kubenswrapper[4931]: E1201 15:43:19.243082 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:43:30 crc kubenswrapper[4931]: I1201 15:43:30.242305 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:43:30 crc kubenswrapper[4931]: E1201 15:43:30.243081 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:43:43 crc kubenswrapper[4931]: I1201 15:43:43.242034 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:43:43 crc kubenswrapper[4931]: E1201 15:43:43.242960 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:43:57 crc kubenswrapper[4931]: I1201 15:43:57.244460 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:43:57 crc kubenswrapper[4931]: E1201 15:43:57.245639 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:44:08 crc kubenswrapper[4931]: I1201 15:44:08.242020 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:44:08 crc kubenswrapper[4931]: E1201 15:44:08.243247 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:44:20 crc kubenswrapper[4931]: I1201 15:44:20.241200 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:44:20 crc kubenswrapper[4931]: E1201 15:44:20.241930 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:44:33 crc kubenswrapper[4931]: I1201 15:44:33.241498 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:44:33 crc kubenswrapper[4931]: E1201 15:44:33.242235 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:44:48 crc kubenswrapper[4931]: I1201 15:44:48.241784 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:44:48 crc kubenswrapper[4931]: E1201 15:44:48.242845 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.151664 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p"]
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.154847 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p"
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.157144 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.157438 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.160422 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p"]
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.223346 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88519f43-0eae-4aa3-8a29-b349816cd21c-config-volume\") pod \"collect-profiles-29410065-cx99p\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p"
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.223801 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88519f43-0eae-4aa3-8a29-b349816cd21c-secret-volume\") pod \"collect-profiles-29410065-cx99p\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p"
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.223886 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntcp2\" (UniqueName: \"kubernetes.io/projected/88519f43-0eae-4aa3-8a29-b349816cd21c-kube-api-access-ntcp2\") pod \"collect-profiles-29410065-cx99p\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p"
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.241659 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700"
Dec 01 15:45:00 crc kubenswrapper[4931]: E1201 15:45:00.242000 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.326122 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88519f43-0eae-4aa3-8a29-b349816cd21c-config-volume\") pod \"collect-profiles-29410065-cx99p\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p"
Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.326272 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88519f43-0eae-4aa3-8a29-b349816cd21c-secret-volume\") pod \"collect-profiles-29410065-cx99p\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p"
Dec 01 15:45:00 crc kubenswrapper[4931]: 
I1201 15:45:00.326340 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntcp2\" (UniqueName: \"kubernetes.io/projected/88519f43-0eae-4aa3-8a29-b349816cd21c-kube-api-access-ntcp2\") pod \"collect-profiles-29410065-cx99p\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.326998 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88519f43-0eae-4aa3-8a29-b349816cd21c-config-volume\") pod \"collect-profiles-29410065-cx99p\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.332211 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88519f43-0eae-4aa3-8a29-b349816cd21c-secret-volume\") pod \"collect-profiles-29410065-cx99p\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.341432 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntcp2\" (UniqueName: \"kubernetes.io/projected/88519f43-0eae-4aa3-8a29-b349816cd21c-kube-api-access-ntcp2\") pod \"collect-profiles-29410065-cx99p\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.517519 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" Dec 01 15:45:00 crc kubenswrapper[4931]: I1201 15:45:00.975732 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p"] Dec 01 15:45:00 crc kubenswrapper[4931]: W1201 15:45:00.984062 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88519f43_0eae_4aa3_8a29_b349816cd21c.slice/crio-17885a33470e13e9614ef2b2cbc375de58f29b340d03c654e46a8e0d7adec422 WatchSource:0}: Error finding container 17885a33470e13e9614ef2b2cbc375de58f29b340d03c654e46a8e0d7adec422: Status 404 returned error can't find the container with id 17885a33470e13e9614ef2b2cbc375de58f29b340d03c654e46a8e0d7adec422 Dec 01 15:45:01 crc kubenswrapper[4931]: I1201 15:45:01.039179 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" event={"ID":"88519f43-0eae-4aa3-8a29-b349816cd21c","Type":"ContainerStarted","Data":"17885a33470e13e9614ef2b2cbc375de58f29b340d03c654e46a8e0d7adec422"} Dec 01 15:45:02 crc kubenswrapper[4931]: I1201 15:45:02.052761 4931 generic.go:334] "Generic (PLEG): container finished" podID="88519f43-0eae-4aa3-8a29-b349816cd21c" containerID="0d40ec967ce144ea2a9a67f3f6967aa340326e683c3d61ac8cee65babc5a5f04" exitCode=0 Dec 01 15:45:02 crc kubenswrapper[4931]: I1201 15:45:02.052842 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" event={"ID":"88519f43-0eae-4aa3-8a29-b349816cd21c","Type":"ContainerDied","Data":"0d40ec967ce144ea2a9a67f3f6967aa340326e683c3d61ac8cee65babc5a5f04"} Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.392564 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.496790 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88519f43-0eae-4aa3-8a29-b349816cd21c-secret-volume\") pod \"88519f43-0eae-4aa3-8a29-b349816cd21c\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.496876 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88519f43-0eae-4aa3-8a29-b349816cd21c-config-volume\") pod \"88519f43-0eae-4aa3-8a29-b349816cd21c\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.496907 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntcp2\" (UniqueName: \"kubernetes.io/projected/88519f43-0eae-4aa3-8a29-b349816cd21c-kube-api-access-ntcp2\") pod \"88519f43-0eae-4aa3-8a29-b349816cd21c\" (UID: \"88519f43-0eae-4aa3-8a29-b349816cd21c\") " Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.498005 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88519f43-0eae-4aa3-8a29-b349816cd21c-config-volume" (OuterVolumeSpecName: "config-volume") pod "88519f43-0eae-4aa3-8a29-b349816cd21c" (UID: "88519f43-0eae-4aa3-8a29-b349816cd21c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.502641 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88519f43-0eae-4aa3-8a29-b349816cd21c-kube-api-access-ntcp2" (OuterVolumeSpecName: "kube-api-access-ntcp2") pod "88519f43-0eae-4aa3-8a29-b349816cd21c" (UID: "88519f43-0eae-4aa3-8a29-b349816cd21c"). 
InnerVolumeSpecName "kube-api-access-ntcp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.502735 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88519f43-0eae-4aa3-8a29-b349816cd21c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "88519f43-0eae-4aa3-8a29-b349816cd21c" (UID: "88519f43-0eae-4aa3-8a29-b349816cd21c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.599627 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/88519f43-0eae-4aa3-8a29-b349816cd21c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.599670 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntcp2\" (UniqueName: \"kubernetes.io/projected/88519f43-0eae-4aa3-8a29-b349816cd21c-kube-api-access-ntcp2\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:03 crc kubenswrapper[4931]: I1201 15:45:03.599685 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88519f43-0eae-4aa3-8a29-b349816cd21c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:04 crc kubenswrapper[4931]: I1201 15:45:04.071529 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" event={"ID":"88519f43-0eae-4aa3-8a29-b349816cd21c","Type":"ContainerDied","Data":"17885a33470e13e9614ef2b2cbc375de58f29b340d03c654e46a8e0d7adec422"} Dec 01 15:45:04 crc kubenswrapper[4931]: I1201 15:45:04.071568 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17885a33470e13e9614ef2b2cbc375de58f29b340d03c654e46a8e0d7adec422" Dec 01 15:45:04 crc kubenswrapper[4931]: I1201 15:45:04.071615 4931 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410065-cx99p" Dec 01 15:45:04 crc kubenswrapper[4931]: I1201 15:45:04.480258 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl"] Dec 01 15:45:04 crc kubenswrapper[4931]: I1201 15:45:04.490951 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410020-rqrgl"] Dec 01 15:45:06 crc kubenswrapper[4931]: I1201 15:45:06.257047 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d008c5dd-f44f-4509-b705-46b4c8819684" path="/var/lib/kubelet/pods/d008c5dd-f44f-4509-b705-46b4c8819684/volumes" Dec 01 15:45:14 crc kubenswrapper[4931]: I1201 15:45:14.177552 4931 generic.go:334] "Generic (PLEG): container finished" podID="90baac5e-c041-4bd1-bba8-b11e708370e7" containerID="885f824143c11c326c0ee9918526d9ef1c72a0a02161cb29cb0b987c3c2a207e" exitCode=0 Dec 01 15:45:14 crc kubenswrapper[4931]: I1201 15:45:14.178067 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns" event={"ID":"90baac5e-c041-4bd1-bba8-b11e708370e7","Type":"ContainerDied","Data":"885f824143c11c326c0ee9918526d9ef1c72a0a02161cb29cb0b987c3c2a207e"} Dec 01 15:45:14 crc kubenswrapper[4931]: I1201 15:45:14.253299 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700" Dec 01 15:45:14 crc kubenswrapper[4931]: E1201 15:45:14.253646 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.586417 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.751381 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-0\") pod \"90baac5e-c041-4bd1-bba8-b11e708370e7\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.751447 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-ssh-key\") pod \"90baac5e-c041-4bd1-bba8-b11e708370e7\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.751500 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5zfk\" (UniqueName: \"kubernetes.io/projected/90baac5e-c041-4bd1-bba8-b11e708370e7-kube-api-access-v5zfk\") pod \"90baac5e-c041-4bd1-bba8-b11e708370e7\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.751544 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-1\") pod \"90baac5e-c041-4bd1-bba8-b11e708370e7\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.751559 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-1\") pod 
\"90baac5e-c041-4bd1-bba8-b11e708370e7\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.751624 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-0\") pod \"90baac5e-c041-4bd1-bba8-b11e708370e7\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.751656 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-combined-ca-bundle\") pod \"90baac5e-c041-4bd1-bba8-b11e708370e7\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.751687 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-extra-config-0\") pod \"90baac5e-c041-4bd1-bba8-b11e708370e7\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.751763 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-inventory\") pod \"90baac5e-c041-4bd1-bba8-b11e708370e7\" (UID: \"90baac5e-c041-4bd1-bba8-b11e708370e7\") " Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.756514 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90baac5e-c041-4bd1-bba8-b11e708370e7-kube-api-access-v5zfk" (OuterVolumeSpecName: "kube-api-access-v5zfk") pod "90baac5e-c041-4bd1-bba8-b11e708370e7" (UID: "90baac5e-c041-4bd1-bba8-b11e708370e7"). InnerVolumeSpecName "kube-api-access-v5zfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.757221 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "90baac5e-c041-4bd1-bba8-b11e708370e7" (UID: "90baac5e-c041-4bd1-bba8-b11e708370e7"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.774582 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "90baac5e-c041-4bd1-bba8-b11e708370e7" (UID: "90baac5e-c041-4bd1-bba8-b11e708370e7"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.785799 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "90baac5e-c041-4bd1-bba8-b11e708370e7" (UID: "90baac5e-c041-4bd1-bba8-b11e708370e7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.788763 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "90baac5e-c041-4bd1-bba8-b11e708370e7" (UID: "90baac5e-c041-4bd1-bba8-b11e708370e7"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.789583 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "90baac5e-c041-4bd1-bba8-b11e708370e7" (UID: "90baac5e-c041-4bd1-bba8-b11e708370e7"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.789609 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "90baac5e-c041-4bd1-bba8-b11e708370e7" (UID: "90baac5e-c041-4bd1-bba8-b11e708370e7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.789969 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "90baac5e-c041-4bd1-bba8-b11e708370e7" (UID: "90baac5e-c041-4bd1-bba8-b11e708370e7"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.795796 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-inventory" (OuterVolumeSpecName: "inventory") pod "90baac5e-c041-4bd1-bba8-b11e708370e7" (UID: "90baac5e-c041-4bd1-bba8-b11e708370e7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.854630 4931 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.854681 4931 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.854694 4931 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.854709 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.854721 4931 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.854734 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.854746 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5zfk\" (UniqueName: \"kubernetes.io/projected/90baac5e-c041-4bd1-bba8-b11e708370e7-kube-api-access-v5zfk\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:15 crc 
kubenswrapper[4931]: I1201 15:45:15.854758 4931 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:15 crc kubenswrapper[4931]: I1201 15:45:15.854774 4931 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/90baac5e-c041-4bd1-bba8-b11e708370e7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.199367 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.199382 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qbgns" event={"ID":"90baac5e-c041-4bd1-bba8-b11e708370e7","Type":"ContainerDied","Data":"16da4afa9d6a1fe3b8b44d788b01e0c2907961d42c7bd8c9aea0ec3fcaebd17e"} Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.199778 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16da4afa9d6a1fe3b8b44d788b01e0c2907961d42c7bd8c9aea0ec3fcaebd17e" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.304231 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl"] Dec 01 15:45:16 crc kubenswrapper[4931]: E1201 15:45:16.304607 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90baac5e-c041-4bd1-bba8-b11e708370e7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.304628 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="90baac5e-c041-4bd1-bba8-b11e708370e7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 15:45:16 crc kubenswrapper[4931]: E1201 
15:45:16.304651 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88519f43-0eae-4aa3-8a29-b349816cd21c" containerName="collect-profiles" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.304657 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="88519f43-0eae-4aa3-8a29-b349816cd21c" containerName="collect-profiles" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.304821 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="90baac5e-c041-4bd1-bba8-b11e708370e7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.304835 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="88519f43-0eae-4aa3-8a29-b349816cd21c" containerName="collect-profiles" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.305435 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.309741 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.310025 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wtv7w" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.310169 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.310289 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.310468 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.317847 4931 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl"] Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.471956 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htrvt\" (UniqueName: \"kubernetes.io/projected/d89048c3-b9a3-4274-8d12-9543d8a29503-kube-api-access-htrvt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.472003 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.472176 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.472229 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: 
I1201 15:45:16.472367 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.472460 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.472499 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.574695 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htrvt\" (UniqueName: \"kubernetes.io/projected/d89048c3-b9a3-4274-8d12-9543d8a29503-kube-api-access-htrvt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.574740 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" 
(UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.574769 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.574794 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.574871 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.574895 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.574920 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.580002 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.580547 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.580628 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.581027 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.581421 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.582724 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.599270 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htrvt\" (UniqueName: \"kubernetes.io/projected/d89048c3-b9a3-4274-8d12-9543d8a29503-kube-api-access-htrvt\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2xthl\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:16 crc kubenswrapper[4931]: I1201 15:45:16.649052 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:45:17 crc kubenswrapper[4931]: I1201 15:45:17.242617 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl"] Dec 01 15:45:18 crc kubenswrapper[4931]: I1201 15:45:18.222178 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" event={"ID":"d89048c3-b9a3-4274-8d12-9543d8a29503","Type":"ContainerStarted","Data":"c103a17977d64ccb5c8a0501303eccd02c97baee1f5fc67c4260117a1f48e947"} Dec 01 15:45:18 crc kubenswrapper[4931]: I1201 15:45:18.222522 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" event={"ID":"d89048c3-b9a3-4274-8d12-9543d8a29503","Type":"ContainerStarted","Data":"6a25a3c11caef41c78698db608f41438953e83080464ae50b759a8de4e5ec6ba"} Dec 01 15:45:18 crc kubenswrapper[4931]: I1201 15:45:18.249517 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" podStartSLOduration=1.61992154 podStartE2EDuration="2.249490532s" podCreationTimestamp="2025-12-01 15:45:16 +0000 UTC" firstStartedPulling="2025-12-01 15:45:17.244371944 +0000 UTC m=+2663.670245651" lastFinishedPulling="2025-12-01 15:45:17.873940956 +0000 UTC m=+2664.299814643" observedRunningTime="2025-12-01 15:45:18.238143903 +0000 UTC m=+2664.664017570" watchObservedRunningTime="2025-12-01 15:45:18.249490532 +0000 UTC m=+2664.675364199" Dec 01 15:45:29 crc kubenswrapper[4931]: I1201 15:45:29.242099 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700" Dec 01 15:45:29 crc kubenswrapper[4931]: E1201 15:45:29.242894 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:45:38 crc kubenswrapper[4931]: I1201 15:45:38.224168 4931 scope.go:117] "RemoveContainer" containerID="c592a428e1001ca7c30ec6909d8b6bb5d953909351bd5125d08ce329afa24f5f" Dec 01 15:45:41 crc kubenswrapper[4931]: I1201 15:45:41.241900 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700" Dec 01 15:45:41 crc kubenswrapper[4931]: E1201 15:45:41.242601 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:45:55 crc kubenswrapper[4931]: I1201 15:45:55.241442 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700" Dec 01 15:45:55 crc kubenswrapper[4931]: I1201 15:45:55.606318 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"640e688b1925e1462bde9a1b083909cc46a8e792843625e3db9aa4c1416d3fa2"} Dec 01 15:47:45 crc kubenswrapper[4931]: I1201 15:47:45.760132 4931 generic.go:334] "Generic (PLEG): container finished" podID="d89048c3-b9a3-4274-8d12-9543d8a29503" containerID="c103a17977d64ccb5c8a0501303eccd02c97baee1f5fc67c4260117a1f48e947" exitCode=0 Dec 01 15:47:45 crc kubenswrapper[4931]: I1201 15:47:45.760188 4931 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" event={"ID":"d89048c3-b9a3-4274-8d12-9543d8a29503","Type":"ContainerDied","Data":"c103a17977d64ccb5c8a0501303eccd02c97baee1f5fc67c4260117a1f48e947"} Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.285618 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.364712 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-telemetry-combined-ca-bundle\") pod \"d89048c3-b9a3-4274-8d12-9543d8a29503\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.364753 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-0\") pod \"d89048c3-b9a3-4274-8d12-9543d8a29503\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.364781 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htrvt\" (UniqueName: \"kubernetes.io/projected/d89048c3-b9a3-4274-8d12-9543d8a29503-kube-api-access-htrvt\") pod \"d89048c3-b9a3-4274-8d12-9543d8a29503\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.364825 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-2\") pod \"d89048c3-b9a3-4274-8d12-9543d8a29503\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " Dec 01 15:47:47 crc 
kubenswrapper[4931]: I1201 15:47:47.364856 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-inventory\") pod \"d89048c3-b9a3-4274-8d12-9543d8a29503\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.364917 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ssh-key\") pod \"d89048c3-b9a3-4274-8d12-9543d8a29503\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.364956 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-1\") pod \"d89048c3-b9a3-4274-8d12-9543d8a29503\" (UID: \"d89048c3-b9a3-4274-8d12-9543d8a29503\") " Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.370761 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d89048c3-b9a3-4274-8d12-9543d8a29503" (UID: "d89048c3-b9a3-4274-8d12-9543d8a29503"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.371961 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89048c3-b9a3-4274-8d12-9543d8a29503-kube-api-access-htrvt" (OuterVolumeSpecName: "kube-api-access-htrvt") pod "d89048c3-b9a3-4274-8d12-9543d8a29503" (UID: "d89048c3-b9a3-4274-8d12-9543d8a29503"). InnerVolumeSpecName "kube-api-access-htrvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.393424 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d89048c3-b9a3-4274-8d12-9543d8a29503" (UID: "d89048c3-b9a3-4274-8d12-9543d8a29503"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.393995 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d89048c3-b9a3-4274-8d12-9543d8a29503" (UID: "d89048c3-b9a3-4274-8d12-9543d8a29503"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.394567 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d89048c3-b9a3-4274-8d12-9543d8a29503" (UID: "d89048c3-b9a3-4274-8d12-9543d8a29503"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.397733 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d89048c3-b9a3-4274-8d12-9543d8a29503" (UID: "d89048c3-b9a3-4274-8d12-9543d8a29503"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.404371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-inventory" (OuterVolumeSpecName: "inventory") pod "d89048c3-b9a3-4274-8d12-9543d8a29503" (UID: "d89048c3-b9a3-4274-8d12-9543d8a29503"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.466400 4931 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.466435 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.466446 4931 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.466457 4931 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.466469 4931 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.466480 4931 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-htrvt\" (UniqueName: \"kubernetes.io/projected/d89048c3-b9a3-4274-8d12-9543d8a29503-kube-api-access-htrvt\") on node \"crc\" DevicePath \"\"" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.466490 4931 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d89048c3-b9a3-4274-8d12-9543d8a29503-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.812109 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" event={"ID":"d89048c3-b9a3-4274-8d12-9543d8a29503","Type":"ContainerDied","Data":"6a25a3c11caef41c78698db608f41438953e83080464ae50b759a8de4e5ec6ba"} Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.812801 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a25a3c11caef41c78698db608f41438953e83080464ae50b759a8de4e5ec6ba" Dec 01 15:47:47 crc kubenswrapper[4931]: I1201 15:47:47.812253 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2xthl" Dec 01 15:48:10 crc kubenswrapper[4931]: I1201 15:48:10.868481 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jsgw8"] Dec 01 15:48:10 crc kubenswrapper[4931]: E1201 15:48:10.869588 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89048c3-b9a3-4274-8d12-9543d8a29503" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 01 15:48:10 crc kubenswrapper[4931]: I1201 15:48:10.869605 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89048c3-b9a3-4274-8d12-9543d8a29503" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 01 15:48:10 crc kubenswrapper[4931]: I1201 15:48:10.869891 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89048c3-b9a3-4274-8d12-9543d8a29503" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 01 15:48:10 crc kubenswrapper[4931]: I1201 15:48:10.871913 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:10 crc kubenswrapper[4931]: I1201 15:48:10.883901 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsgw8"] Dec 01 15:48:10 crc kubenswrapper[4931]: I1201 15:48:10.992152 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-utilities\") pod \"redhat-marketplace-jsgw8\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:10 crc kubenswrapper[4931]: I1201 15:48:10.992268 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-catalog-content\") pod \"redhat-marketplace-jsgw8\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:10 crc kubenswrapper[4931]: I1201 15:48:10.992582 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlt7h\" (UniqueName: \"kubernetes.io/projected/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-kube-api-access-hlt7h\") pod \"redhat-marketplace-jsgw8\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:11 crc kubenswrapper[4931]: I1201 15:48:11.094669 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-utilities\") pod \"redhat-marketplace-jsgw8\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:11 crc kubenswrapper[4931]: I1201 15:48:11.094725 4931 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-catalog-content\") pod \"redhat-marketplace-jsgw8\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:11 crc kubenswrapper[4931]: I1201 15:48:11.094836 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlt7h\" (UniqueName: \"kubernetes.io/projected/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-kube-api-access-hlt7h\") pod \"redhat-marketplace-jsgw8\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:11 crc kubenswrapper[4931]: I1201 15:48:11.095589 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-utilities\") pod \"redhat-marketplace-jsgw8\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:11 crc kubenswrapper[4931]: I1201 15:48:11.095802 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-catalog-content\") pod \"redhat-marketplace-jsgw8\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:11 crc kubenswrapper[4931]: I1201 15:48:11.112676 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlt7h\" (UniqueName: \"kubernetes.io/projected/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-kube-api-access-hlt7h\") pod \"redhat-marketplace-jsgw8\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:11 crc kubenswrapper[4931]: I1201 15:48:11.193545 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:11 crc kubenswrapper[4931]: I1201 15:48:11.652074 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsgw8"] Dec 01 15:48:12 crc kubenswrapper[4931]: I1201 15:48:12.049666 4931 generic.go:334] "Generic (PLEG): container finished" podID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerID="e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a" exitCode=0 Dec 01 15:48:12 crc kubenswrapper[4931]: I1201 15:48:12.049724 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsgw8" event={"ID":"6ce18797-bbfa-44b6-9bb6-84aad7a922c9","Type":"ContainerDied","Data":"e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a"} Dec 01 15:48:12 crc kubenswrapper[4931]: I1201 15:48:12.049750 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsgw8" event={"ID":"6ce18797-bbfa-44b6-9bb6-84aad7a922c9","Type":"ContainerStarted","Data":"0c9b9863b6fc4bd36a6f6bd8f9fc8adf034b3b3bad89804f290bb0025b5dd222"} Dec 01 15:48:12 crc kubenswrapper[4931]: I1201 15:48:12.052749 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 15:48:13 crc kubenswrapper[4931]: I1201 15:48:13.059195 4931 generic.go:334] "Generic (PLEG): container finished" podID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerID="128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4" exitCode=0 Dec 01 15:48:13 crc kubenswrapper[4931]: I1201 15:48:13.059307 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsgw8" event={"ID":"6ce18797-bbfa-44b6-9bb6-84aad7a922c9","Type":"ContainerDied","Data":"128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4"} Dec 01 15:48:13 crc kubenswrapper[4931]: E1201 15:48:13.109508 4931 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce18797_bbfa_44b6_9bb6_84aad7a922c9.slice/crio-128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce18797_bbfa_44b6_9bb6_84aad7a922c9.slice/crio-conmon-128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4.scope\": RecentStats: unable to find data in memory cache]" Dec 01 15:48:15 crc kubenswrapper[4931]: I1201 15:48:15.087041 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsgw8" event={"ID":"6ce18797-bbfa-44b6-9bb6-84aad7a922c9","Type":"ContainerStarted","Data":"d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8"} Dec 01 15:48:15 crc kubenswrapper[4931]: I1201 15:48:15.113240 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jsgw8" podStartSLOduration=3.339071556 podStartE2EDuration="5.113226215s" podCreationTimestamp="2025-12-01 15:48:10 +0000 UTC" firstStartedPulling="2025-12-01 15:48:12.052437327 +0000 UTC m=+2838.478310984" lastFinishedPulling="2025-12-01 15:48:13.826591936 +0000 UTC m=+2840.252465643" observedRunningTime="2025-12-01 15:48:15.108220353 +0000 UTC m=+2841.534094020" watchObservedRunningTime="2025-12-01 15:48:15.113226215 +0000 UTC m=+2841.539099882" Dec 01 15:48:19 crc kubenswrapper[4931]: I1201 15:48:19.872263 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:48:19 crc kubenswrapper[4931]: I1201 15:48:19.872995 4931 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:48:21 crc kubenswrapper[4931]: I1201 15:48:21.194880 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:21 crc kubenswrapper[4931]: I1201 15:48:21.195733 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:21 crc kubenswrapper[4931]: I1201 15:48:21.255832 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:22 crc kubenswrapper[4931]: I1201 15:48:22.204619 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:22 crc kubenswrapper[4931]: I1201 15:48:22.272428 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsgw8"] Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.176870 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jsgw8" podUID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerName="registry-server" containerID="cri-o://d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8" gracePeriod=2 Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.638230 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.788823 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-utilities\") pod \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.789301 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlt7h\" (UniqueName: \"kubernetes.io/projected/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-kube-api-access-hlt7h\") pod \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.789334 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-catalog-content\") pod \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\" (UID: \"6ce18797-bbfa-44b6-9bb6-84aad7a922c9\") " Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.789940 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-utilities" (OuterVolumeSpecName: "utilities") pod "6ce18797-bbfa-44b6-9bb6-84aad7a922c9" (UID: "6ce18797-bbfa-44b6-9bb6-84aad7a922c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.795100 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-kube-api-access-hlt7h" (OuterVolumeSpecName: "kube-api-access-hlt7h") pod "6ce18797-bbfa-44b6-9bb6-84aad7a922c9" (UID: "6ce18797-bbfa-44b6-9bb6-84aad7a922c9"). InnerVolumeSpecName "kube-api-access-hlt7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.808909 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ce18797-bbfa-44b6-9bb6-84aad7a922c9" (UID: "6ce18797-bbfa-44b6-9bb6-84aad7a922c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.919141 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlt7h\" (UniqueName: \"kubernetes.io/projected/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-kube-api-access-hlt7h\") on node \"crc\" DevicePath \"\"" Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.919407 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:48:24 crc kubenswrapper[4931]: I1201 15:48:24.919531 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce18797-bbfa-44b6-9bb6-84aad7a922c9-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.186243 4931 generic.go:334] "Generic (PLEG): container finished" podID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerID="d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8" exitCode=0 Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.186290 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsgw8" event={"ID":"6ce18797-bbfa-44b6-9bb6-84aad7a922c9","Type":"ContainerDied","Data":"d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8"} Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.186340 4931 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-jsgw8" event={"ID":"6ce18797-bbfa-44b6-9bb6-84aad7a922c9","Type":"ContainerDied","Data":"0c9b9863b6fc4bd36a6f6bd8f9fc8adf034b3b3bad89804f290bb0025b5dd222"} Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.186361 4931 scope.go:117] "RemoveContainer" containerID="d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.187489 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsgw8" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.225071 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsgw8"] Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.227891 4931 scope.go:117] "RemoveContainer" containerID="128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.236576 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsgw8"] Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.251953 4931 scope.go:117] "RemoveContainer" containerID="e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.288458 4931 scope.go:117] "RemoveContainer" containerID="d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8" Dec 01 15:48:25 crc kubenswrapper[4931]: E1201 15:48:25.289242 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8\": container with ID starting with d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8 not found: ID does not exist" containerID="d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.289312 4931 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8"} err="failed to get container status \"d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8\": rpc error: code = NotFound desc = could not find container \"d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8\": container with ID starting with d4385969e5942491c22302b0aeed695e0366ca437bccf35303980764558d5fc8 not found: ID does not exist" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.289465 4931 scope.go:117] "RemoveContainer" containerID="128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4" Dec 01 15:48:25 crc kubenswrapper[4931]: E1201 15:48:25.289976 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4\": container with ID starting with 128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4 not found: ID does not exist" containerID="128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.290019 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4"} err="failed to get container status \"128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4\": rpc error: code = NotFound desc = could not find container \"128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4\": container with ID starting with 128d279e00e42ce16f22eb435a27cda2320028c276250c1017ca365fdefc2cd4 not found: ID does not exist" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.290049 4931 scope.go:117] "RemoveContainer" containerID="e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a" Dec 01 15:48:25 crc kubenswrapper[4931]: E1201 
15:48:25.290683 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a\": container with ID starting with e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a not found: ID does not exist" containerID="e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a" Dec 01 15:48:25 crc kubenswrapper[4931]: I1201 15:48:25.290702 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a"} err="failed to get container status \"e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a\": rpc error: code = NotFound desc = could not find container \"e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a\": container with ID starting with e11fa1d26b25ccc799bc49ed20ff90ceb4f4fcca2ada7b18914b09dd80984f1a not found: ID does not exist" Dec 01 15:48:26 crc kubenswrapper[4931]: I1201 15:48:26.253361 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" path="/var/lib/kubelet/pods/6ce18797-bbfa-44b6-9bb6-84aad7a922c9/volumes" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.251294 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 15:48:31 crc kubenswrapper[4931]: E1201 15:48:31.252656 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerName="extract-content" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.252684 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerName="extract-content" Dec 01 15:48:31 crc kubenswrapper[4931]: E1201 15:48:31.252716 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" 
containerName="registry-server" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.252729 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerName="registry-server" Dec 01 15:48:31 crc kubenswrapper[4931]: E1201 15:48:31.252756 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerName="extract-utilities" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.252771 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerName="extract-utilities" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.253180 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce18797-bbfa-44b6-9bb6-84aad7a922c9" containerName="registry-server" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.254863 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.257507 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.257733 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-k2668" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.263866 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.264257 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.267660 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.343273 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.343314 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.343420 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.343447 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.343528 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.343552 4931 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs2fc\" (UniqueName: \"kubernetes.io/projected/e384c534-76cd-4296-9318-aaf007e87661-kube-api-access-hs2fc\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.343744 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.343801 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.343832 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-config-data\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.445604 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.445649 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs2fc\" (UniqueName: \"kubernetes.io/projected/e384c534-76cd-4296-9318-aaf007e87661-kube-api-access-hs2fc\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.445714 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.445753 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.445776 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-config-data\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.445844 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.445864 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.445890 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.445910 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.446834 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.446962 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.447034 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.448167 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-config-data\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.448355 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.452657 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.452914 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.455702 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc 
kubenswrapper[4931]: I1201 15:48:31.463251 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs2fc\" (UniqueName: \"kubernetes.io/projected/e384c534-76cd-4296-9318-aaf007e87661-kube-api-access-hs2fc\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.484783 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " pod="openstack/tempest-tests-tempest" Dec 01 15:48:31 crc kubenswrapper[4931]: I1201 15:48:31.587945 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 15:48:32 crc kubenswrapper[4931]: I1201 15:48:32.086437 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 15:48:32 crc kubenswrapper[4931]: I1201 15:48:32.262460 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e384c534-76cd-4296-9318-aaf007e87661","Type":"ContainerStarted","Data":"f4720224af89e27fed09cb6d1d10ba4090016053cb0b6964586884d28160a677"} Dec 01 15:48:49 crc kubenswrapper[4931]: I1201 15:48:49.872041 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:48:49 crc kubenswrapper[4931]: I1201 15:48:49.872644 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:48:58 crc kubenswrapper[4931]: E1201 15:48:58.772649 4931 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 01 15:48:58 crc kubenswrapper[4931]: E1201 15:48:58.773221 4931 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,Mount
Path:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hs2fc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(e384c534-76cd-4296-9318-aaf007e87661): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 15:48:58 crc kubenswrapper[4931]: E1201 15:48:58.774503 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/tempest-tests-tempest" podUID="e384c534-76cd-4296-9318-aaf007e87661" Dec 01 15:48:59 crc kubenswrapper[4931]: E1201 15:48:59.545498 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="e384c534-76cd-4296-9318-aaf007e87661" Dec 01 15:49:16 crc kubenswrapper[4931]: I1201 15:49:16.538891 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 15:49:17 crc kubenswrapper[4931]: I1201 15:49:17.747671 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e384c534-76cd-4296-9318-aaf007e87661","Type":"ContainerStarted","Data":"989bb3bf58f18d976d1b7fa55f857735e91cbd076adbe6baa653e74c28d5e877"} Dec 01 15:49:17 crc kubenswrapper[4931]: I1201 15:49:17.779431 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.333947799 podStartE2EDuration="47.779410662s" podCreationTimestamp="2025-12-01 15:48:30 +0000 UTC" firstStartedPulling="2025-12-01 15:48:32.089293153 +0000 UTC m=+2858.515166840" lastFinishedPulling="2025-12-01 15:49:16.534755996 +0000 UTC m=+2902.960629703" observedRunningTime="2025-12-01 15:49:17.76983282 +0000 UTC m=+2904.195706507" watchObservedRunningTime="2025-12-01 15:49:17.779410662 +0000 UTC m=+2904.205284339" Dec 01 15:49:19 crc kubenswrapper[4931]: I1201 15:49:19.872519 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:49:19 crc kubenswrapper[4931]: 
I1201 15:49:19.872876 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:49:19 crc kubenswrapper[4931]: I1201 15:49:19.872924 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:49:19 crc kubenswrapper[4931]: I1201 15:49:19.873894 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"640e688b1925e1462bde9a1b083909cc46a8e792843625e3db9aa4c1416d3fa2"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:49:19 crc kubenswrapper[4931]: I1201 15:49:19.873954 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://640e688b1925e1462bde9a1b083909cc46a8e792843625e3db9aa4c1416d3fa2" gracePeriod=600 Dec 01 15:49:20 crc kubenswrapper[4931]: I1201 15:49:20.781021 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="640e688b1925e1462bde9a1b083909cc46a8e792843625e3db9aa4c1416d3fa2" exitCode=0 Dec 01 15:49:20 crc kubenswrapper[4931]: I1201 15:49:20.781080 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"640e688b1925e1462bde9a1b083909cc46a8e792843625e3db9aa4c1416d3fa2"} Dec 01 15:49:20 crc 
kubenswrapper[4931]: I1201 15:49:20.781557 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46"} Dec 01 15:49:20 crc kubenswrapper[4931]: I1201 15:49:20.781577 4931 scope.go:117] "RemoveContainer" containerID="214c421ce1616af7cfe048dc34d1d82d3c6ef060678028a44bdf0cd2883f6700" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.309071 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f7mf8"] Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.311836 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.321753 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7mf8"] Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.424157 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-utilities\") pod \"community-operators-f7mf8\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.424306 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-catalog-content\") pod \"community-operators-f7mf8\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.424337 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vmcnc\" (UniqueName: \"kubernetes.io/projected/18746cd6-b301-4a32-8231-f198277567b3-kube-api-access-vmcnc\") pod \"community-operators-f7mf8\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.526516 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-utilities\") pod \"community-operators-f7mf8\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.526668 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-catalog-content\") pod \"community-operators-f7mf8\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.526700 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmcnc\" (UniqueName: \"kubernetes.io/projected/18746cd6-b301-4a32-8231-f198277567b3-kube-api-access-vmcnc\") pod \"community-operators-f7mf8\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.526981 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-utilities\") pod \"community-operators-f7mf8\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.527042 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-catalog-content\") pod \"community-operators-f7mf8\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.548661 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmcnc\" (UniqueName: \"kubernetes.io/projected/18746cd6-b301-4a32-8231-f198277567b3-kube-api-access-vmcnc\") pod \"community-operators-f7mf8\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:21 crc kubenswrapper[4931]: I1201 15:50:21.640434 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.163630 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f7mf8"] Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.307021 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xcdhx"] Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.308951 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.319750 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xcdhx"] Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.372720 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7mf8" event={"ID":"18746cd6-b301-4a32-8231-f198277567b3","Type":"ContainerStarted","Data":"740e5ea3110a40fbd2b0b7af27339472bfaf21307a205f0a2381e447bd5c65ea"} Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.441596 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8hbv\" (UniqueName: \"kubernetes.io/projected/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-kube-api-access-k8hbv\") pod \"redhat-operators-xcdhx\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.441686 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-catalog-content\") pod \"redhat-operators-xcdhx\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.441812 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-utilities\") pod \"redhat-operators-xcdhx\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.543768 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-catalog-content\") pod \"redhat-operators-xcdhx\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.543915 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-utilities\") pod \"redhat-operators-xcdhx\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.543965 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8hbv\" (UniqueName: \"kubernetes.io/projected/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-kube-api-access-k8hbv\") pod \"redhat-operators-xcdhx\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.544654 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-catalog-content\") pod \"redhat-operators-xcdhx\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.544657 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-utilities\") pod \"redhat-operators-xcdhx\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.572183 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8hbv\" (UniqueName: 
\"kubernetes.io/projected/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-kube-api-access-k8hbv\") pod \"redhat-operators-xcdhx\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:22 crc kubenswrapper[4931]: I1201 15:50:22.657056 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.111169 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xcdhx"] Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.381947 4931 generic.go:334] "Generic (PLEG): container finished" podID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerID="11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820" exitCode=0 Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.382047 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcdhx" event={"ID":"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8","Type":"ContainerDied","Data":"11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820"} Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.382078 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcdhx" event={"ID":"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8","Type":"ContainerStarted","Data":"079ee3ede28f187315ae06bce359ab376e3144b54047c4441764bde9fc23091f"} Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.383831 4931 generic.go:334] "Generic (PLEG): container finished" podID="18746cd6-b301-4a32-8231-f198277567b3" containerID="dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890" exitCode=0 Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.383863 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7mf8" 
event={"ID":"18746cd6-b301-4a32-8231-f198277567b3","Type":"ContainerDied","Data":"dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890"} Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.705135 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-crtqf"] Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.707031 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.718653 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-crtqf"] Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.870300 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-utilities\") pod \"certified-operators-crtqf\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.870463 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qbrc\" (UniqueName: \"kubernetes.io/projected/013acb63-d2dd-440c-b71d-7489a0c704eb-kube-api-access-8qbrc\") pod \"certified-operators-crtqf\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.870543 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-catalog-content\") pod \"certified-operators-crtqf\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.972152 
4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-catalog-content\") pod \"certified-operators-crtqf\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.972247 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-utilities\") pod \"certified-operators-crtqf\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.972399 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qbrc\" (UniqueName: \"kubernetes.io/projected/013acb63-d2dd-440c-b71d-7489a0c704eb-kube-api-access-8qbrc\") pod \"certified-operators-crtqf\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.973367 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-catalog-content\") pod \"certified-operators-crtqf\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:23 crc kubenswrapper[4931]: I1201 15:50:23.973673 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-utilities\") pod \"certified-operators-crtqf\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:24 crc kubenswrapper[4931]: I1201 15:50:24.011951 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8qbrc\" (UniqueName: \"kubernetes.io/projected/013acb63-d2dd-440c-b71d-7489a0c704eb-kube-api-access-8qbrc\") pod \"certified-operators-crtqf\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:24 crc kubenswrapper[4931]: I1201 15:50:24.024576 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:24 crc kubenswrapper[4931]: I1201 15:50:24.524092 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-crtqf"] Dec 01 15:50:24 crc kubenswrapper[4931]: W1201 15:50:24.545545 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod013acb63_d2dd_440c_b71d_7489a0c704eb.slice/crio-489faae74c18a1ab6e8b804c794faa483c4013c4912cc206bc653cbac4cfce06 WatchSource:0}: Error finding container 489faae74c18a1ab6e8b804c794faa483c4013c4912cc206bc653cbac4cfce06: Status 404 returned error can't find the container with id 489faae74c18a1ab6e8b804c794faa483c4013c4912cc206bc653cbac4cfce06 Dec 01 15:50:25 crc kubenswrapper[4931]: I1201 15:50:25.403675 4931 generic.go:334] "Generic (PLEG): container finished" podID="18746cd6-b301-4a32-8231-f198277567b3" containerID="6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169" exitCode=0 Dec 01 15:50:25 crc kubenswrapper[4931]: I1201 15:50:25.403891 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7mf8" event={"ID":"18746cd6-b301-4a32-8231-f198277567b3","Type":"ContainerDied","Data":"6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169"} Dec 01 15:50:25 crc kubenswrapper[4931]: I1201 15:50:25.406326 4931 generic.go:334] "Generic (PLEG): container finished" podID="013acb63-d2dd-440c-b71d-7489a0c704eb" 
containerID="6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d" exitCode=0 Dec 01 15:50:25 crc kubenswrapper[4931]: I1201 15:50:25.406410 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crtqf" event={"ID":"013acb63-d2dd-440c-b71d-7489a0c704eb","Type":"ContainerDied","Data":"6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d"} Dec 01 15:50:25 crc kubenswrapper[4931]: I1201 15:50:25.406436 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crtqf" event={"ID":"013acb63-d2dd-440c-b71d-7489a0c704eb","Type":"ContainerStarted","Data":"489faae74c18a1ab6e8b804c794faa483c4013c4912cc206bc653cbac4cfce06"} Dec 01 15:50:25 crc kubenswrapper[4931]: I1201 15:50:25.409154 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcdhx" event={"ID":"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8","Type":"ContainerStarted","Data":"0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e"} Dec 01 15:50:27 crc kubenswrapper[4931]: I1201 15:50:27.428246 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7mf8" event={"ID":"18746cd6-b301-4a32-8231-f198277567b3","Type":"ContainerStarted","Data":"ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f"} Dec 01 15:50:27 crc kubenswrapper[4931]: I1201 15:50:27.429841 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crtqf" event={"ID":"013acb63-d2dd-440c-b71d-7489a0c704eb","Type":"ContainerStarted","Data":"db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd"} Dec 01 15:50:27 crc kubenswrapper[4931]: I1201 15:50:27.431561 4931 generic.go:334] "Generic (PLEG): container finished" podID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerID="0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e" exitCode=0 Dec 01 15:50:27 crc 
kubenswrapper[4931]: I1201 15:50:27.431592 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcdhx" event={"ID":"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8","Type":"ContainerDied","Data":"0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e"} Dec 01 15:50:27 crc kubenswrapper[4931]: I1201 15:50:27.457407 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f7mf8" podStartSLOduration=3.9127688259999998 podStartE2EDuration="6.457368548s" podCreationTimestamp="2025-12-01 15:50:21 +0000 UTC" firstStartedPulling="2025-12-01 15:50:23.385614189 +0000 UTC m=+2969.811487856" lastFinishedPulling="2025-12-01 15:50:25.930213911 +0000 UTC m=+2972.356087578" observedRunningTime="2025-12-01 15:50:27.452239402 +0000 UTC m=+2973.878113089" watchObservedRunningTime="2025-12-01 15:50:27.457368548 +0000 UTC m=+2973.883242215" Dec 01 15:50:29 crc kubenswrapper[4931]: I1201 15:50:29.454891 4931 generic.go:334] "Generic (PLEG): container finished" podID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerID="db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd" exitCode=0 Dec 01 15:50:29 crc kubenswrapper[4931]: I1201 15:50:29.454986 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crtqf" event={"ID":"013acb63-d2dd-440c-b71d-7489a0c704eb","Type":"ContainerDied","Data":"db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd"} Dec 01 15:50:29 crc kubenswrapper[4931]: I1201 15:50:29.459747 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcdhx" event={"ID":"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8","Type":"ContainerStarted","Data":"13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6"} Dec 01 15:50:29 crc kubenswrapper[4931]: I1201 15:50:29.500227 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-xcdhx" podStartSLOduration=2.494303006 podStartE2EDuration="7.499766522s" podCreationTimestamp="2025-12-01 15:50:22 +0000 UTC" firstStartedPulling="2025-12-01 15:50:23.383419747 +0000 UTC m=+2969.809293414" lastFinishedPulling="2025-12-01 15:50:28.388883263 +0000 UTC m=+2974.814756930" observedRunningTime="2025-12-01 15:50:29.494846892 +0000 UTC m=+2975.920720579" watchObservedRunningTime="2025-12-01 15:50:29.499766522 +0000 UTC m=+2975.925640189" Dec 01 15:50:30 crc kubenswrapper[4931]: I1201 15:50:30.472031 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crtqf" event={"ID":"013acb63-d2dd-440c-b71d-7489a0c704eb","Type":"ContainerStarted","Data":"7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852"} Dec 01 15:50:30 crc kubenswrapper[4931]: I1201 15:50:30.498257 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-crtqf" podStartSLOduration=3.041119469 podStartE2EDuration="7.498237349s" podCreationTimestamp="2025-12-01 15:50:23 +0000 UTC" firstStartedPulling="2025-12-01 15:50:25.408429307 +0000 UTC m=+2971.834302974" lastFinishedPulling="2025-12-01 15:50:29.865547187 +0000 UTC m=+2976.291420854" observedRunningTime="2025-12-01 15:50:30.490557061 +0000 UTC m=+2976.916430778" watchObservedRunningTime="2025-12-01 15:50:30.498237349 +0000 UTC m=+2976.924111026" Dec 01 15:50:31 crc kubenswrapper[4931]: I1201 15:50:31.785968 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:31 crc kubenswrapper[4931]: I1201 15:50:31.786242 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:31 crc kubenswrapper[4931]: I1201 15:50:31.844634 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:32 crc kubenswrapper[4931]: I1201 15:50:32.532845 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:32 crc kubenswrapper[4931]: I1201 15:50:32.657579 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:32 crc kubenswrapper[4931]: I1201 15:50:32.657719 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:33 crc kubenswrapper[4931]: I1201 15:50:33.699572 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xcdhx" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerName="registry-server" probeResult="failure" output=< Dec 01 15:50:33 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Dec 01 15:50:33 crc kubenswrapper[4931]: > Dec 01 15:50:34 crc kubenswrapper[4931]: I1201 15:50:34.025364 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:34 crc kubenswrapper[4931]: I1201 15:50:34.025778 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:34 crc kubenswrapper[4931]: I1201 15:50:34.076880 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:34 crc kubenswrapper[4931]: I1201 15:50:34.099792 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7mf8"] Dec 01 15:50:34 crc kubenswrapper[4931]: I1201 15:50:34.506670 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f7mf8" 
podUID="18746cd6-b301-4a32-8231-f198277567b3" containerName="registry-server" containerID="cri-o://ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f" gracePeriod=2 Dec 01 15:50:34 crc kubenswrapper[4931]: I1201 15:50:34.962361 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.043316 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-catalog-content\") pod \"18746cd6-b301-4a32-8231-f198277567b3\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.043500 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmcnc\" (UniqueName: \"kubernetes.io/projected/18746cd6-b301-4a32-8231-f198277567b3-kube-api-access-vmcnc\") pod \"18746cd6-b301-4a32-8231-f198277567b3\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.043543 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-utilities\") pod \"18746cd6-b301-4a32-8231-f198277567b3\" (UID: \"18746cd6-b301-4a32-8231-f198277567b3\") " Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.044509 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-utilities" (OuterVolumeSpecName: "utilities") pod "18746cd6-b301-4a32-8231-f198277567b3" (UID: "18746cd6-b301-4a32-8231-f198277567b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.056419 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18746cd6-b301-4a32-8231-f198277567b3-kube-api-access-vmcnc" (OuterVolumeSpecName: "kube-api-access-vmcnc") pod "18746cd6-b301-4a32-8231-f198277567b3" (UID: "18746cd6-b301-4a32-8231-f198277567b3"). InnerVolumeSpecName "kube-api-access-vmcnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.103564 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18746cd6-b301-4a32-8231-f198277567b3" (UID: "18746cd6-b301-4a32-8231-f198277567b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.145737 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmcnc\" (UniqueName: \"kubernetes.io/projected/18746cd6-b301-4a32-8231-f198277567b3-kube-api-access-vmcnc\") on node \"crc\" DevicePath \"\"" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.145767 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.145777 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18746cd6-b301-4a32-8231-f198277567b3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.516396 4931 generic.go:334] "Generic (PLEG): container finished" podID="18746cd6-b301-4a32-8231-f198277567b3" 
containerID="ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f" exitCode=0 Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.516458 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7mf8" event={"ID":"18746cd6-b301-4a32-8231-f198277567b3","Type":"ContainerDied","Data":"ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f"} Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.517118 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f7mf8" event={"ID":"18746cd6-b301-4a32-8231-f198277567b3","Type":"ContainerDied","Data":"740e5ea3110a40fbd2b0b7af27339472bfaf21307a205f0a2381e447bd5c65ea"} Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.517152 4931 scope.go:117] "RemoveContainer" containerID="ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.516491 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f7mf8" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.537673 4931 scope.go:117] "RemoveContainer" containerID="6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.557571 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f7mf8"] Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.569236 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f7mf8"] Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.577236 4931 scope.go:117] "RemoveContainer" containerID="dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.587212 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.617346 4931 scope.go:117] "RemoveContainer" containerID="ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f" Dec 01 15:50:35 crc kubenswrapper[4931]: E1201 15:50:35.617853 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f\": container with ID starting with ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f not found: ID does not exist" containerID="ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.617895 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f"} err="failed to get container status \"ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f\": rpc error: code = NotFound desc = could not 
find container \"ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f\": container with ID starting with ab448d64b3c93fc58d75aa33acc1e85bc405d5f6b7e34562191ecf677d13832f not found: ID does not exist" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.617923 4931 scope.go:117] "RemoveContainer" containerID="6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169" Dec 01 15:50:35 crc kubenswrapper[4931]: E1201 15:50:35.618210 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169\": container with ID starting with 6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169 not found: ID does not exist" containerID="6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.618241 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169"} err="failed to get container status \"6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169\": rpc error: code = NotFound desc = could not find container \"6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169\": container with ID starting with 6ec296b7ea77ed53757f81b44e9d0b90d57fc70de1e0392bc6d7f9826079e169 not found: ID does not exist" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.618263 4931 scope.go:117] "RemoveContainer" containerID="dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890" Dec 01 15:50:35 crc kubenswrapper[4931]: E1201 15:50:35.618467 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890\": container with ID starting with dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890 not found: ID 
does not exist" containerID="dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890" Dec 01 15:50:35 crc kubenswrapper[4931]: I1201 15:50:35.618496 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890"} err="failed to get container status \"dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890\": rpc error: code = NotFound desc = could not find container \"dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890\": container with ID starting with dfbdbb226a4e5164b0d9b0e549962b01d6cdf5041adecb600c5783bc04477890 not found: ID does not exist" Dec 01 15:50:36 crc kubenswrapper[4931]: I1201 15:50:36.254196 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18746cd6-b301-4a32-8231-f198277567b3" path="/var/lib/kubelet/pods/18746cd6-b301-4a32-8231-f198277567b3/volumes" Dec 01 15:50:36 crc kubenswrapper[4931]: I1201 15:50:36.704341 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-crtqf"] Dec 01 15:50:37 crc kubenswrapper[4931]: I1201 15:50:37.543086 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-crtqf" podUID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerName="registry-server" containerID="cri-o://7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852" gracePeriod=2 Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.023118 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.205483 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-utilities\") pod \"013acb63-d2dd-440c-b71d-7489a0c704eb\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.205514 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-catalog-content\") pod \"013acb63-d2dd-440c-b71d-7489a0c704eb\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.205542 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qbrc\" (UniqueName: \"kubernetes.io/projected/013acb63-d2dd-440c-b71d-7489a0c704eb-kube-api-access-8qbrc\") pod \"013acb63-d2dd-440c-b71d-7489a0c704eb\" (UID: \"013acb63-d2dd-440c-b71d-7489a0c704eb\") " Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.206578 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-utilities" (OuterVolumeSpecName: "utilities") pod "013acb63-d2dd-440c-b71d-7489a0c704eb" (UID: "013acb63-d2dd-440c-b71d-7489a0c704eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.210825 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013acb63-d2dd-440c-b71d-7489a0c704eb-kube-api-access-8qbrc" (OuterVolumeSpecName: "kube-api-access-8qbrc") pod "013acb63-d2dd-440c-b71d-7489a0c704eb" (UID: "013acb63-d2dd-440c-b71d-7489a0c704eb"). InnerVolumeSpecName "kube-api-access-8qbrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.267046 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "013acb63-d2dd-440c-b71d-7489a0c704eb" (UID: "013acb63-d2dd-440c-b71d-7489a0c704eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.308362 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.308470 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013acb63-d2dd-440c-b71d-7489a0c704eb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.308489 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qbrc\" (UniqueName: \"kubernetes.io/projected/013acb63-d2dd-440c-b71d-7489a0c704eb-kube-api-access-8qbrc\") on node \"crc\" DevicePath \"\"" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.553031 4931 generic.go:334] "Generic (PLEG): container finished" podID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerID="7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852" exitCode=0 Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.553077 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-crtqf" event={"ID":"013acb63-d2dd-440c-b71d-7489a0c704eb","Type":"ContainerDied","Data":"7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852"} Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.553111 4931 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-crtqf" event={"ID":"013acb63-d2dd-440c-b71d-7489a0c704eb","Type":"ContainerDied","Data":"489faae74c18a1ab6e8b804c794faa483c4013c4912cc206bc653cbac4cfce06"} Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.553118 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-crtqf" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.553136 4931 scope.go:117] "RemoveContainer" containerID="7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.575105 4931 scope.go:117] "RemoveContainer" containerID="db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.588542 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-crtqf"] Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.594870 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-crtqf"] Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.597959 4931 scope.go:117] "RemoveContainer" containerID="6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.639414 4931 scope.go:117] "RemoveContainer" containerID="7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852" Dec 01 15:50:38 crc kubenswrapper[4931]: E1201 15:50:38.639875 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852\": container with ID starting with 7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852 not found: ID does not exist" containerID="7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 
15:50:38.639914 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852"} err="failed to get container status \"7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852\": rpc error: code = NotFound desc = could not find container \"7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852\": container with ID starting with 7eb8e36e449f26e6d00e0dab78fa3e4fea0c0d80597026f287e7484abc6a5852 not found: ID does not exist" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.639942 4931 scope.go:117] "RemoveContainer" containerID="db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd" Dec 01 15:50:38 crc kubenswrapper[4931]: E1201 15:50:38.640186 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd\": container with ID starting with db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd not found: ID does not exist" containerID="db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.640216 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd"} err="failed to get container status \"db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd\": rpc error: code = NotFound desc = could not find container \"db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd\": container with ID starting with db34cdcf90dd698f87b00ecbbe74287817d8441f64d1032a79ee16c8beca65dd not found: ID does not exist" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.640236 4931 scope.go:117] "RemoveContainer" containerID="6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d" Dec 01 15:50:38 crc 
kubenswrapper[4931]: E1201 15:50:38.640469 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d\": container with ID starting with 6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d not found: ID does not exist" containerID="6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d" Dec 01 15:50:38 crc kubenswrapper[4931]: I1201 15:50:38.640489 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d"} err="failed to get container status \"6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d\": rpc error: code = NotFound desc = could not find container \"6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d\": container with ID starting with 6ad762cf91d4a7fef03270d149496b4a89b4802304469377929aee1aac333e5d not found: ID does not exist" Dec 01 15:50:40 crc kubenswrapper[4931]: I1201 15:50:40.266249 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013acb63-d2dd-440c-b71d-7489a0c704eb" path="/var/lib/kubelet/pods/013acb63-d2dd-440c-b71d-7489a0c704eb/volumes" Dec 01 15:50:42 crc kubenswrapper[4931]: I1201 15:50:42.726964 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:42 crc kubenswrapper[4931]: I1201 15:50:42.797806 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:42 crc kubenswrapper[4931]: I1201 15:50:42.973208 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xcdhx"] Dec 01 15:50:44 crc kubenswrapper[4931]: I1201 15:50:44.621033 4931 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-xcdhx" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerName="registry-server" containerID="cri-o://13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6" gracePeriod=2 Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.095357 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.245173 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-catalog-content\") pod \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.245703 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-utilities\") pod \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.245831 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8hbv\" (UniqueName: \"kubernetes.io/projected/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-kube-api-access-k8hbv\") pod \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\" (UID: \"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8\") " Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.246357 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-utilities" (OuterVolumeSpecName: "utilities") pod "ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" (UID: "ad3f2fea-2b15-48ed-a8e7-7fb0198033f8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.253574 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-kube-api-access-k8hbv" (OuterVolumeSpecName: "kube-api-access-k8hbv") pod "ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" (UID: "ad3f2fea-2b15-48ed-a8e7-7fb0198033f8"). InnerVolumeSpecName "kube-api-access-k8hbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.349944 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.349984 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8hbv\" (UniqueName: \"kubernetes.io/projected/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-kube-api-access-k8hbv\") on node \"crc\" DevicePath \"\"" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.374206 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" (UID: "ad3f2fea-2b15-48ed-a8e7-7fb0198033f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.451821 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.629919 4931 generic.go:334] "Generic (PLEG): container finished" podID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerID="13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6" exitCode=0 Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.629964 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcdhx" event={"ID":"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8","Type":"ContainerDied","Data":"13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6"} Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.629992 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcdhx" event={"ID":"ad3f2fea-2b15-48ed-a8e7-7fb0198033f8","Type":"ContainerDied","Data":"079ee3ede28f187315ae06bce359ab376e3144b54047c4441764bde9fc23091f"} Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.630011 4931 scope.go:117] "RemoveContainer" containerID="13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.630052 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xcdhx" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.655260 4931 scope.go:117] "RemoveContainer" containerID="0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.686502 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xcdhx"] Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.689170 4931 scope.go:117] "RemoveContainer" containerID="11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.695338 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xcdhx"] Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.745475 4931 scope.go:117] "RemoveContainer" containerID="13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6" Dec 01 15:50:45 crc kubenswrapper[4931]: E1201 15:50:45.745966 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6\": container with ID starting with 13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6 not found: ID does not exist" containerID="13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.746012 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6"} err="failed to get container status \"13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6\": rpc error: code = NotFound desc = could not find container \"13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6\": container with ID starting with 13227e7188d76204875f99b109f0b3bf550eb771ababebf362b715c9669b6ad6 not found: ID does 
not exist" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.746039 4931 scope.go:117] "RemoveContainer" containerID="0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e" Dec 01 15:50:45 crc kubenswrapper[4931]: E1201 15:50:45.746279 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e\": container with ID starting with 0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e not found: ID does not exist" containerID="0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.746304 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e"} err="failed to get container status \"0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e\": rpc error: code = NotFound desc = could not find container \"0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e\": container with ID starting with 0586b1cbd830f994ced3d1ef0a70bd38586e984e9b2da3638990b3e906ef2c3e not found: ID does not exist" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.746319 4931 scope.go:117] "RemoveContainer" containerID="11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820" Dec 01 15:50:45 crc kubenswrapper[4931]: E1201 15:50:45.746546 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820\": container with ID starting with 11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820 not found: ID does not exist" containerID="11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820" Dec 01 15:50:45 crc kubenswrapper[4931]: I1201 15:50:45.746569 4931 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820"} err="failed to get container status \"11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820\": rpc error: code = NotFound desc = could not find container \"11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820\": container with ID starting with 11324dc4848c8a9b56c189108f7d6582bb9124b8f315424e2b1a400a5ff89820 not found: ID does not exist" Dec 01 15:50:46 crc kubenswrapper[4931]: I1201 15:50:46.254201 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" path="/var/lib/kubelet/pods/ad3f2fea-2b15-48ed-a8e7-7fb0198033f8/volumes" Dec 01 15:51:49 crc kubenswrapper[4931]: I1201 15:51:49.872250 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:51:49 crc kubenswrapper[4931]: I1201 15:51:49.872850 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:52:19 crc kubenswrapper[4931]: I1201 15:52:19.871870 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:52:19 crc kubenswrapper[4931]: I1201 15:52:19.872472 4931 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:52:49 crc kubenswrapper[4931]: I1201 15:52:49.872153 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:52:49 crc kubenswrapper[4931]: I1201 15:52:49.872781 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:52:49 crc kubenswrapper[4931]: I1201 15:52:49.872834 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 15:52:49 crc kubenswrapper[4931]: I1201 15:52:49.873627 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:52:49 crc kubenswrapper[4931]: I1201 15:52:49.873720 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" 
containerID="cri-o://b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" gracePeriod=600 Dec 01 15:52:50 crc kubenswrapper[4931]: E1201 15:52:50.001334 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:52:50 crc kubenswrapper[4931]: I1201 15:52:50.835705 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" exitCode=0 Dec 01 15:52:50 crc kubenswrapper[4931]: I1201 15:52:50.835766 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46"} Dec 01 15:52:50 crc kubenswrapper[4931]: I1201 15:52:50.835876 4931 scope.go:117] "RemoveContainer" containerID="640e688b1925e1462bde9a1b083909cc46a8e792843625e3db9aa4c1416d3fa2" Dec 01 15:52:50 crc kubenswrapper[4931]: I1201 15:52:50.836496 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:52:50 crc kubenswrapper[4931]: E1201 15:52:50.836861 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:53:02 crc kubenswrapper[4931]: I1201 15:53:02.243959 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:53:02 crc kubenswrapper[4931]: E1201 15:53:02.244946 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:53:13 crc kubenswrapper[4931]: I1201 15:53:13.242106 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:53:13 crc kubenswrapper[4931]: E1201 15:53:13.242914 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:53:25 crc kubenswrapper[4931]: I1201 15:53:25.241229 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:53:25 crc kubenswrapper[4931]: E1201 15:53:25.241993 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:53:39 crc kubenswrapper[4931]: I1201 15:53:39.242977 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:53:39 crc kubenswrapper[4931]: E1201 15:53:39.244495 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:53:52 crc kubenswrapper[4931]: I1201 15:53:52.242496 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:53:52 crc kubenswrapper[4931]: E1201 15:53:52.243342 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:54:07 crc kubenswrapper[4931]: I1201 15:54:07.242527 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:54:07 crc kubenswrapper[4931]: E1201 15:54:07.243793 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:54:18 crc kubenswrapper[4931]: I1201 15:54:18.241723 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:54:18 crc kubenswrapper[4931]: E1201 15:54:18.242612 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:54:29 crc kubenswrapper[4931]: I1201 15:54:29.242969 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:54:29 crc kubenswrapper[4931]: E1201 15:54:29.244297 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:54:42 crc kubenswrapper[4931]: I1201 15:54:42.242646 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:54:42 crc kubenswrapper[4931]: E1201 15:54:42.243848 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:54:54 crc kubenswrapper[4931]: I1201 15:54:54.275193 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:54:54 crc kubenswrapper[4931]: E1201 15:54:54.277882 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:55:09 crc kubenswrapper[4931]: I1201 15:55:09.241771 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:55:09 crc kubenswrapper[4931]: E1201 15:55:09.242904 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:55:20 crc kubenswrapper[4931]: I1201 15:55:20.242230 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:55:20 crc kubenswrapper[4931]: E1201 15:55:20.243598 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:55:35 crc kubenswrapper[4931]: I1201 15:55:35.241544 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:55:35 crc kubenswrapper[4931]: E1201 15:55:35.242178 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:55:46 crc kubenswrapper[4931]: I1201 15:55:46.241882 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:55:46 crc kubenswrapper[4931]: E1201 15:55:46.242715 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:55:58 crc kubenswrapper[4931]: I1201 15:55:58.242495 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:55:58 crc kubenswrapper[4931]: E1201 15:55:58.243570 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:56:13 crc kubenswrapper[4931]: I1201 15:56:13.240956 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:56:13 crc kubenswrapper[4931]: E1201 15:56:13.241733 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:56:28 crc kubenswrapper[4931]: I1201 15:56:28.241637 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:56:28 crc kubenswrapper[4931]: E1201 15:56:28.242567 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:56:39 crc kubenswrapper[4931]: I1201 15:56:39.241012 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:56:39 crc kubenswrapper[4931]: E1201 15:56:39.241665 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:56:53 crc kubenswrapper[4931]: I1201 15:56:53.241983 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:56:53 crc kubenswrapper[4931]: E1201 15:56:53.242906 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:57:07 crc kubenswrapper[4931]: I1201 15:57:07.242446 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:57:07 crc kubenswrapper[4931]: E1201 15:57:07.243201 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:57:21 crc kubenswrapper[4931]: I1201 15:57:21.241174 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:57:21 crc kubenswrapper[4931]: E1201 15:57:21.242122 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:57:34 crc kubenswrapper[4931]: I1201 15:57:34.249123 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:57:34 crc kubenswrapper[4931]: E1201 15:57:34.250069 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:57:48 crc kubenswrapper[4931]: I1201 15:57:48.242750 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:57:48 crc kubenswrapper[4931]: E1201 15:57:48.243587 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 15:58:02 crc kubenswrapper[4931]: I1201 15:58:02.243188 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 15:58:02 crc kubenswrapper[4931]: I1201 15:58:02.961108 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"d7a9470ce8c868229b979f2cb9e7efbcbadfca1c6e29e938ab705e714f043b38"} Dec 01 15:59:56 crc kubenswrapper[4931]: I1201 15:59:56.035846 4931 generic.go:334] "Generic (PLEG): container finished" podID="e384c534-76cd-4296-9318-aaf007e87661" containerID="989bb3bf58f18d976d1b7fa55f857735e91cbd076adbe6baa653e74c28d5e877" exitCode=0 Dec 01 15:59:56 crc kubenswrapper[4931]: I1201 15:59:56.035950 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e384c534-76cd-4296-9318-aaf007e87661","Type":"ContainerDied","Data":"989bb3bf58f18d976d1b7fa55f857735e91cbd076adbe6baa653e74c28d5e877"} Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.512526 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.629352 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-openstack-config\") pod \"e384c534-76cd-4296-9318-aaf007e87661\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.629474 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-openstack-config-secret\") pod \"e384c534-76cd-4296-9318-aaf007e87661\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.629557 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-temporary\") pod 
\"e384c534-76cd-4296-9318-aaf007e87661\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.629582 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs2fc\" (UniqueName: \"kubernetes.io/projected/e384c534-76cd-4296-9318-aaf007e87661-kube-api-access-hs2fc\") pod \"e384c534-76cd-4296-9318-aaf007e87661\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.629645 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"e384c534-76cd-4296-9318-aaf007e87661\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.629729 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ssh-key\") pod \"e384c534-76cd-4296-9318-aaf007e87661\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.629746 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ca-certs\") pod \"e384c534-76cd-4296-9318-aaf007e87661\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.629813 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-config-data\") pod \"e384c534-76cd-4296-9318-aaf007e87661\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.629828 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" 
(UniqueName: \"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-workdir\") pod \"e384c534-76cd-4296-9318-aaf007e87661\" (UID: \"e384c534-76cd-4296-9318-aaf007e87661\") " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.630808 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e384c534-76cd-4296-9318-aaf007e87661" (UID: "e384c534-76cd-4296-9318-aaf007e87661"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.631076 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-config-data" (OuterVolumeSpecName: "config-data") pod "e384c534-76cd-4296-9318-aaf007e87661" (UID: "e384c534-76cd-4296-9318-aaf007e87661"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.650829 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e384c534-76cd-4296-9318-aaf007e87661" (UID: "e384c534-76cd-4296-9318-aaf007e87661"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.651012 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e384c534-76cd-4296-9318-aaf007e87661" (UID: "e384c534-76cd-4296-9318-aaf007e87661"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.664638 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e384c534-76cd-4296-9318-aaf007e87661-kube-api-access-hs2fc" (OuterVolumeSpecName: "kube-api-access-hs2fc") pod "e384c534-76cd-4296-9318-aaf007e87661" (UID: "e384c534-76cd-4296-9318-aaf007e87661"). InnerVolumeSpecName "kube-api-access-hs2fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.698604 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e384c534-76cd-4296-9318-aaf007e87661" (UID: "e384c534-76cd-4296-9318-aaf007e87661"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.721531 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e384c534-76cd-4296-9318-aaf007e87661" (UID: "e384c534-76cd-4296-9318-aaf007e87661"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.734678 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.734710 4931 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.734721 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.734731 4931 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e384c534-76cd-4296-9318-aaf007e87661-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.734742 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs2fc\" (UniqueName: \"kubernetes.io/projected/e384c534-76cd-4296-9318-aaf007e87661-kube-api-access-hs2fc\") on node \"crc\" DevicePath \"\"" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.734762 4931 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.734772 4931 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 
01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.743694 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e384c534-76cd-4296-9318-aaf007e87661" (UID: "e384c534-76cd-4296-9318-aaf007e87661"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.765996 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e384c534-76cd-4296-9318-aaf007e87661" (UID: "e384c534-76cd-4296-9318-aaf007e87661"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.768569 4931 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.836134 4931 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.836170 4931 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e384c534-76cd-4296-9318-aaf007e87661-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 15:59:57 crc kubenswrapper[4931]: I1201 15:59:57.836180 4931 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e384c534-76cd-4296-9318-aaf007e87661-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 15:59:58 crc kubenswrapper[4931]: I1201 15:59:58.057531 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"e384c534-76cd-4296-9318-aaf007e87661","Type":"ContainerDied","Data":"f4720224af89e27fed09cb6d1d10ba4090016053cb0b6964586884d28160a677"} Dec 01 15:59:58 crc kubenswrapper[4931]: I1201 15:59:58.057565 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4720224af89e27fed09cb6d1d10ba4090016053cb0b6964586884d28160a677" Dec 01 15:59:58 crc kubenswrapper[4931]: I1201 15:59:58.057576 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.165789 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48"] Dec 01 16:00:00 crc kubenswrapper[4931]: E1201 16:00:00.168229 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18746cd6-b301-4a32-8231-f198277567b3" containerName="registry-server" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168285 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="18746cd6-b301-4a32-8231-f198277567b3" containerName="registry-server" Dec 01 16:00:00 crc kubenswrapper[4931]: E1201 16:00:00.168310 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18746cd6-b301-4a32-8231-f198277567b3" containerName="extract-utilities" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168323 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="18746cd6-b301-4a32-8231-f198277567b3" containerName="extract-utilities" Dec 01 16:00:00 crc kubenswrapper[4931]: E1201 16:00:00.168365 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18746cd6-b301-4a32-8231-f198277567b3" containerName="extract-content" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168378 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="18746cd6-b301-4a32-8231-f198277567b3" containerName="extract-content" Dec 01 16:00:00 crc 
kubenswrapper[4931]: E1201 16:00:00.168439 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerName="registry-server" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168466 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerName="registry-server" Dec 01 16:00:00 crc kubenswrapper[4931]: E1201 16:00:00.168493 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerName="extract-utilities" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168504 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerName="extract-utilities" Dec 01 16:00:00 crc kubenswrapper[4931]: E1201 16:00:00.168527 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerName="extract-utilities" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168569 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerName="extract-utilities" Dec 01 16:00:00 crc kubenswrapper[4931]: E1201 16:00:00.168593 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerName="registry-server" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168607 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerName="registry-server" Dec 01 16:00:00 crc kubenswrapper[4931]: E1201 16:00:00.168628 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e384c534-76cd-4296-9318-aaf007e87661" containerName="tempest-tests-tempest-tests-runner" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168640 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="e384c534-76cd-4296-9318-aaf007e87661" 
containerName="tempest-tests-tempest-tests-runner" Dec 01 16:00:00 crc kubenswrapper[4931]: E1201 16:00:00.168669 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerName="extract-content" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168682 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerName="extract-content" Dec 01 16:00:00 crc kubenswrapper[4931]: E1201 16:00:00.168709 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerName="extract-content" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.168721 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerName="extract-content" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.169054 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3f2fea-2b15-48ed-a8e7-7fb0198033f8" containerName="registry-server" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.169089 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="013acb63-d2dd-440c-b71d-7489a0c704eb" containerName="registry-server" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.169116 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="e384c534-76cd-4296-9318-aaf007e87661" containerName="tempest-tests-tempest-tests-runner" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.169150 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="18746cd6-b301-4a32-8231-f198277567b3" containerName="registry-server" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.170223 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.175422 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.180170 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.181094 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48"] Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.329875 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a9fe0e-95b1-47ed-83bd-916c9f219907-config-volume\") pod \"collect-profiles-29410080-vxr48\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.330020 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmlxs\" (UniqueName: \"kubernetes.io/projected/94a9fe0e-95b1-47ed-83bd-916c9f219907-kube-api-access-fmlxs\") pod \"collect-profiles-29410080-vxr48\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.330234 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94a9fe0e-95b1-47ed-83bd-916c9f219907-secret-volume\") pod \"collect-profiles-29410080-vxr48\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.431492 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a9fe0e-95b1-47ed-83bd-916c9f219907-config-volume\") pod \"collect-profiles-29410080-vxr48\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.431614 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmlxs\" (UniqueName: \"kubernetes.io/projected/94a9fe0e-95b1-47ed-83bd-916c9f219907-kube-api-access-fmlxs\") pod \"collect-profiles-29410080-vxr48\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.431681 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94a9fe0e-95b1-47ed-83bd-916c9f219907-secret-volume\") pod \"collect-profiles-29410080-vxr48\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.432450 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a9fe0e-95b1-47ed-83bd-916c9f219907-config-volume\") pod \"collect-profiles-29410080-vxr48\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.440487 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/94a9fe0e-95b1-47ed-83bd-916c9f219907-secret-volume\") pod \"collect-profiles-29410080-vxr48\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.452691 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmlxs\" (UniqueName: \"kubernetes.io/projected/94a9fe0e-95b1-47ed-83bd-916c9f219907-kube-api-access-fmlxs\") pod \"collect-profiles-29410080-vxr48\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.503722 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:00 crc kubenswrapper[4931]: I1201 16:00:00.970835 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48"] Dec 01 16:00:01 crc kubenswrapper[4931]: I1201 16:00:01.090711 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" event={"ID":"94a9fe0e-95b1-47ed-83bd-916c9f219907","Type":"ContainerStarted","Data":"5991fc878f41f3462ec4c97d554bb078440f1d48432bdb6e6bcd210c4dd30064"} Dec 01 16:00:02 crc kubenswrapper[4931]: I1201 16:00:02.104061 4931 generic.go:334] "Generic (PLEG): container finished" podID="94a9fe0e-95b1-47ed-83bd-916c9f219907" containerID="624ead17ca8e4e74a9a48353bb7b34e9516894bc67d7667cfa3f0c292b0f3eab" exitCode=0 Dec 01 16:00:02 crc kubenswrapper[4931]: I1201 16:00:02.104134 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" 
event={"ID":"94a9fe0e-95b1-47ed-83bd-916c9f219907","Type":"ContainerDied","Data":"624ead17ca8e4e74a9a48353bb7b34e9516894bc67d7667cfa3f0c292b0f3eab"} Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.436520 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.597910 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a9fe0e-95b1-47ed-83bd-916c9f219907-config-volume\") pod \"94a9fe0e-95b1-47ed-83bd-916c9f219907\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.598103 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmlxs\" (UniqueName: \"kubernetes.io/projected/94a9fe0e-95b1-47ed-83bd-916c9f219907-kube-api-access-fmlxs\") pod \"94a9fe0e-95b1-47ed-83bd-916c9f219907\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.598200 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94a9fe0e-95b1-47ed-83bd-916c9f219907-secret-volume\") pod \"94a9fe0e-95b1-47ed-83bd-916c9f219907\" (UID: \"94a9fe0e-95b1-47ed-83bd-916c9f219907\") " Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.599006 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a9fe0e-95b1-47ed-83bd-916c9f219907-config-volume" (OuterVolumeSpecName: "config-volume") pod "94a9fe0e-95b1-47ed-83bd-916c9f219907" (UID: "94a9fe0e-95b1-47ed-83bd-916c9f219907"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.604221 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a9fe0e-95b1-47ed-83bd-916c9f219907-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "94a9fe0e-95b1-47ed-83bd-916c9f219907" (UID: "94a9fe0e-95b1-47ed-83bd-916c9f219907"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.604829 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a9fe0e-95b1-47ed-83bd-916c9f219907-kube-api-access-fmlxs" (OuterVolumeSpecName: "kube-api-access-fmlxs") pod "94a9fe0e-95b1-47ed-83bd-916c9f219907" (UID: "94a9fe0e-95b1-47ed-83bd-916c9f219907"). InnerVolumeSpecName "kube-api-access-fmlxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.700168 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94a9fe0e-95b1-47ed-83bd-916c9f219907-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.700202 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmlxs\" (UniqueName: \"kubernetes.io/projected/94a9fe0e-95b1-47ed-83bd-916c9f219907-kube-api-access-fmlxs\") on node \"crc\" DevicePath \"\"" Dec 01 16:00:03 crc kubenswrapper[4931]: I1201 16:00:03.700214 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94a9fe0e-95b1-47ed-83bd-916c9f219907-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.049772 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 16:00:04 crc kubenswrapper[4931]: E1201 
16:00:04.055700 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a9fe0e-95b1-47ed-83bd-916c9f219907" containerName="collect-profiles" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.055736 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a9fe0e-95b1-47ed-83bd-916c9f219907" containerName="collect-profiles" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.056045 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a9fe0e-95b1-47ed-83bd-916c9f219907" containerName="collect-profiles" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.056769 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.058966 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-k2668" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.060316 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.123716 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" event={"ID":"94a9fe0e-95b1-47ed-83bd-916c9f219907","Type":"ContainerDied","Data":"5991fc878f41f3462ec4c97d554bb078440f1d48432bdb6e6bcd210c4dd30064"} Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.123769 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410080-vxr48" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.123779 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5991fc878f41f3462ec4c97d554bb078440f1d48432bdb6e6bcd210c4dd30064" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.209784 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b0a9c963-758f-4d4d-a9c0-e148a5733bf9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.210019 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcr7j\" (UniqueName: \"kubernetes.io/projected/b0a9c963-758f-4d4d-a9c0-e148a5733bf9-kube-api-access-bcr7j\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b0a9c963-758f-4d4d-a9c0-e148a5733bf9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.312046 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcr7j\" (UniqueName: \"kubernetes.io/projected/b0a9c963-758f-4d4d-a9c0-e148a5733bf9-kube-api-access-bcr7j\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b0a9c963-758f-4d4d-a9c0-e148a5733bf9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.312191 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"b0a9c963-758f-4d4d-a9c0-e148a5733bf9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.312650 4931 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b0a9c963-758f-4d4d-a9c0-e148a5733bf9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.327633 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcr7j\" (UniqueName: \"kubernetes.io/projected/b0a9c963-758f-4d4d-a9c0-e148a5733bf9-kube-api-access-bcr7j\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b0a9c963-758f-4d4d-a9c0-e148a5733bf9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.338609 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"b0a9c963-758f-4d4d-a9c0-e148a5733bf9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.375788 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.538235 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk"] Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.546750 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410035-6dclk"] Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.890677 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 16:00:04 crc kubenswrapper[4931]: W1201 16:00:04.900055 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0a9c963_758f_4d4d_a9c0_e148a5733bf9.slice/crio-6d804815050060e54afb83e7253f38642f9f5a121557848a6c559ea15828f102 WatchSource:0}: Error finding container 6d804815050060e54afb83e7253f38642f9f5a121557848a6c559ea15828f102: Status 404 returned error can't find the container with id 6d804815050060e54afb83e7253f38642f9f5a121557848a6c559ea15828f102 Dec 01 16:00:04 crc kubenswrapper[4931]: I1201 16:00:04.901979 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 16:00:05 crc kubenswrapper[4931]: I1201 16:00:05.133024 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b0a9c963-758f-4d4d-a9c0-e148a5733bf9","Type":"ContainerStarted","Data":"6d804815050060e54afb83e7253f38642f9f5a121557848a6c559ea15828f102"} Dec 01 16:00:06 crc kubenswrapper[4931]: I1201 16:00:06.252652 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="269c4450-aefe-45b5-b753-0be7e90e8af6" path="/var/lib/kubelet/pods/269c4450-aefe-45b5-b753-0be7e90e8af6/volumes" Dec 01 16:00:09 crc 
kubenswrapper[4931]: I1201 16:00:09.184685 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"b0a9c963-758f-4d4d-a9c0-e148a5733bf9","Type":"ContainerStarted","Data":"70b8cb723d42a3b8ce2a082975567c6eff793838a59e29bde650a19a8b061a3f"} Dec 01 16:00:09 crc kubenswrapper[4931]: I1201 16:00:09.208543 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.744931944 podStartE2EDuration="5.208526362s" podCreationTimestamp="2025-12-01 16:00:04 +0000 UTC" firstStartedPulling="2025-12-01 16:00:04.901769533 +0000 UTC m=+3551.327643200" lastFinishedPulling="2025-12-01 16:00:08.365363941 +0000 UTC m=+3554.791237618" observedRunningTime="2025-12-01 16:00:09.207304369 +0000 UTC m=+3555.633178116" watchObservedRunningTime="2025-12-01 16:00:09.208526362 +0000 UTC m=+3555.634400029" Dec 01 16:00:19 crc kubenswrapper[4931]: I1201 16:00:19.872841 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:00:19 crc kubenswrapper[4931]: I1201 16:00:19.873681 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.578421 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p88bd/must-gather-84mmd"] Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.580859 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.583141 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-p88bd"/"default-dockercfg-7qtc4" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.583195 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p88bd"/"openshift-service-ca.crt" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.592926 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p88bd"/"kube-root-ca.crt" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.597051 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p88bd/must-gather-84mmd"] Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.670207 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73286ae8-c068-48b6-ac89-8f5807eb0e54-must-gather-output\") pod \"must-gather-84mmd\" (UID: \"73286ae8-c068-48b6-ac89-8f5807eb0e54\") " pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.670285 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww4x9\" (UniqueName: \"kubernetes.io/projected/73286ae8-c068-48b6-ac89-8f5807eb0e54-kube-api-access-ww4x9\") pod \"must-gather-84mmd\" (UID: \"73286ae8-c068-48b6-ac89-8f5807eb0e54\") " pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.771635 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73286ae8-c068-48b6-ac89-8f5807eb0e54-must-gather-output\") pod \"must-gather-84mmd\" (UID: \"73286ae8-c068-48b6-ac89-8f5807eb0e54\") " 
pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.771738 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww4x9\" (UniqueName: \"kubernetes.io/projected/73286ae8-c068-48b6-ac89-8f5807eb0e54-kube-api-access-ww4x9\") pod \"must-gather-84mmd\" (UID: \"73286ae8-c068-48b6-ac89-8f5807eb0e54\") " pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.772235 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73286ae8-c068-48b6-ac89-8f5807eb0e54-must-gather-output\") pod \"must-gather-84mmd\" (UID: \"73286ae8-c068-48b6-ac89-8f5807eb0e54\") " pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.792406 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww4x9\" (UniqueName: \"kubernetes.io/projected/73286ae8-c068-48b6-ac89-8f5807eb0e54-kube-api-access-ww4x9\") pod \"must-gather-84mmd\" (UID: \"73286ae8-c068-48b6-ac89-8f5807eb0e54\") " pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:00:33 crc kubenswrapper[4931]: I1201 16:00:33.901415 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:00:34 crc kubenswrapper[4931]: I1201 16:00:34.360234 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p88bd/must-gather-84mmd"] Dec 01 16:00:34 crc kubenswrapper[4931]: W1201 16:00:34.360546 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73286ae8_c068_48b6_ac89_8f5807eb0e54.slice/crio-a8b3d9a96eec1d5e86221d03817770df2b3d7fea0957c4c2c128008eee3d9890 WatchSource:0}: Error finding container a8b3d9a96eec1d5e86221d03817770df2b3d7fea0957c4c2c128008eee3d9890: Status 404 returned error can't find the container with id a8b3d9a96eec1d5e86221d03817770df2b3d7fea0957c4c2c128008eee3d9890 Dec 01 16:00:34 crc kubenswrapper[4931]: I1201 16:00:34.423722 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/must-gather-84mmd" event={"ID":"73286ae8-c068-48b6-ac89-8f5807eb0e54","Type":"ContainerStarted","Data":"a8b3d9a96eec1d5e86221d03817770df2b3d7fea0957c4c2c128008eee3d9890"} Dec 01 16:00:38 crc kubenswrapper[4931]: I1201 16:00:38.459265 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/must-gather-84mmd" event={"ID":"73286ae8-c068-48b6-ac89-8f5807eb0e54","Type":"ContainerStarted","Data":"5c806437690342fcf8b5d00b937642795f07ce5585284606f50affb8ce833f66"} Dec 01 16:00:38 crc kubenswrapper[4931]: I1201 16:00:38.697708 4931 scope.go:117] "RemoveContainer" containerID="bd6c263e7e46de183945e0ae47b15f6382e2d85efa702a709ac3aa199e922967" Dec 01 16:00:39 crc kubenswrapper[4931]: I1201 16:00:39.467711 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/must-gather-84mmd" event={"ID":"73286ae8-c068-48b6-ac89-8f5807eb0e54","Type":"ContainerStarted","Data":"caffc891a42f82cd2b0b08f92a8ed83901eb1e3be9c8cb528e52019581b4086a"} Dec 01 16:00:39 crc kubenswrapper[4931]: I1201 16:00:39.484653 
4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p88bd/must-gather-84mmd" podStartSLOduration=2.6140926049999997 podStartE2EDuration="6.484634864s" podCreationTimestamp="2025-12-01 16:00:33 +0000 UTC" firstStartedPulling="2025-12-01 16:00:34.362414532 +0000 UTC m=+3580.788288199" lastFinishedPulling="2025-12-01 16:00:38.232956791 +0000 UTC m=+3584.658830458" observedRunningTime="2025-12-01 16:00:39.480132592 +0000 UTC m=+3585.906006269" watchObservedRunningTime="2025-12-01 16:00:39.484634864 +0000 UTC m=+3585.910508521" Dec 01 16:00:41 crc kubenswrapper[4931]: I1201 16:00:41.853709 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p88bd/crc-debug-d8wt9"] Dec 01 16:00:41 crc kubenswrapper[4931]: I1201 16:00:41.855197 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:00:42 crc kubenswrapper[4931]: I1201 16:00:42.031642 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d401e0-9a64-4f38-9ae1-00a8c55d9802-host\") pod \"crc-debug-d8wt9\" (UID: \"29d401e0-9a64-4f38-9ae1-00a8c55d9802\") " pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:00:42 crc kubenswrapper[4931]: I1201 16:00:42.031920 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pj6q\" (UniqueName: \"kubernetes.io/projected/29d401e0-9a64-4f38-9ae1-00a8c55d9802-kube-api-access-7pj6q\") pod \"crc-debug-d8wt9\" (UID: \"29d401e0-9a64-4f38-9ae1-00a8c55d9802\") " pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:00:42 crc kubenswrapper[4931]: I1201 16:00:42.134665 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d401e0-9a64-4f38-9ae1-00a8c55d9802-host\") pod \"crc-debug-d8wt9\" (UID: 
\"29d401e0-9a64-4f38-9ae1-00a8c55d9802\") " pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:00:42 crc kubenswrapper[4931]: I1201 16:00:42.134755 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pj6q\" (UniqueName: \"kubernetes.io/projected/29d401e0-9a64-4f38-9ae1-00a8c55d9802-kube-api-access-7pj6q\") pod \"crc-debug-d8wt9\" (UID: \"29d401e0-9a64-4f38-9ae1-00a8c55d9802\") " pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:00:42 crc kubenswrapper[4931]: I1201 16:00:42.134761 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d401e0-9a64-4f38-9ae1-00a8c55d9802-host\") pod \"crc-debug-d8wt9\" (UID: \"29d401e0-9a64-4f38-9ae1-00a8c55d9802\") " pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:00:42 crc kubenswrapper[4931]: I1201 16:00:42.158872 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pj6q\" (UniqueName: \"kubernetes.io/projected/29d401e0-9a64-4f38-9ae1-00a8c55d9802-kube-api-access-7pj6q\") pod \"crc-debug-d8wt9\" (UID: \"29d401e0-9a64-4f38-9ae1-00a8c55d9802\") " pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:00:42 crc kubenswrapper[4931]: I1201 16:00:42.171864 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:00:42 crc kubenswrapper[4931]: W1201 16:00:42.199497 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29d401e0_9a64_4f38_9ae1_00a8c55d9802.slice/crio-79b85bce7d9a926b958c762c05906d565f5bdbf110584fe32fe6e77b04ff6ed1 WatchSource:0}: Error finding container 79b85bce7d9a926b958c762c05906d565f5bdbf110584fe32fe6e77b04ff6ed1: Status 404 returned error can't find the container with id 79b85bce7d9a926b958c762c05906d565f5bdbf110584fe32fe6e77b04ff6ed1 Dec 01 16:00:42 crc kubenswrapper[4931]: I1201 16:00:42.496026 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/crc-debug-d8wt9" event={"ID":"29d401e0-9a64-4f38-9ae1-00a8c55d9802","Type":"ContainerStarted","Data":"79b85bce7d9a926b958c762c05906d565f5bdbf110584fe32fe6e77b04ff6ed1"} Dec 01 16:00:49 crc kubenswrapper[4931]: I1201 16:00:49.872132 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:00:49 crc kubenswrapper[4931]: I1201 16:00:49.872767 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:00:54 crc kubenswrapper[4931]: I1201 16:00:54.606340 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/crc-debug-d8wt9" 
event={"ID":"29d401e0-9a64-4f38-9ae1-00a8c55d9802","Type":"ContainerStarted","Data":"bbc15a9d22ab98bdf1fea6046732f7eb8f2e3516da125c14f3f93f03a4cdc30e"} Dec 01 16:00:54 crc kubenswrapper[4931]: I1201 16:00:54.617712 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p88bd/crc-debug-d8wt9" podStartSLOduration=1.6516156610000001 podStartE2EDuration="13.617693978s" podCreationTimestamp="2025-12-01 16:00:41 +0000 UTC" firstStartedPulling="2025-12-01 16:00:42.201085295 +0000 UTC m=+3588.626958962" lastFinishedPulling="2025-12-01 16:00:54.167163592 +0000 UTC m=+3600.593037279" observedRunningTime="2025-12-01 16:00:54.616893317 +0000 UTC m=+3601.042767004" watchObservedRunningTime="2025-12-01 16:00:54.617693978 +0000 UTC m=+3601.043567665" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.144022 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29410081-j4xz2"] Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.147664 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.158911 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410081-j4xz2"] Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.297641 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-config-data\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.297773 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-fernet-keys\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.297885 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-combined-ca-bundle\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.297982 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vqgr\" (UniqueName: \"kubernetes.io/projected/736a9048-9855-44b1-aae0-9da840848c45-kube-api-access-7vqgr\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.399669 4931 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-config-data\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.399707 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-fernet-keys\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.399762 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-combined-ca-bundle\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.399800 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vqgr\" (UniqueName: \"kubernetes.io/projected/736a9048-9855-44b1-aae0-9da840848c45-kube-api-access-7vqgr\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.405759 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-combined-ca-bundle\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.405821 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-config-data\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.405883 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-fernet-keys\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.415748 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vqgr\" (UniqueName: \"kubernetes.io/projected/736a9048-9855-44b1-aae0-9da840848c45-kube-api-access-7vqgr\") pod \"keystone-cron-29410081-j4xz2\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:00 crc kubenswrapper[4931]: I1201 16:01:00.471478 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:01 crc kubenswrapper[4931]: I1201 16:01:01.213071 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410081-j4xz2"] Dec 01 16:01:01 crc kubenswrapper[4931]: W1201 16:01:01.220524 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod736a9048_9855_44b1_aae0_9da840848c45.slice/crio-eb8e1d393ac44b18263cd0eff87548c0b81cd5ea23e9205d9813a7fa1e150353 WatchSource:0}: Error finding container eb8e1d393ac44b18263cd0eff87548c0b81cd5ea23e9205d9813a7fa1e150353: Status 404 returned error can't find the container with id eb8e1d393ac44b18263cd0eff87548c0b81cd5ea23e9205d9813a7fa1e150353 Dec 01 16:01:01 crc kubenswrapper[4931]: I1201 16:01:01.668569 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410081-j4xz2" event={"ID":"736a9048-9855-44b1-aae0-9da840848c45","Type":"ContainerStarted","Data":"6cc6c165651f910bb89c160e03d307227f4e363f2f811276c7be0c21ffff1eb1"} Dec 01 16:01:01 crc kubenswrapper[4931]: I1201 16:01:01.668927 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410081-j4xz2" event={"ID":"736a9048-9855-44b1-aae0-9da840848c45","Type":"ContainerStarted","Data":"eb8e1d393ac44b18263cd0eff87548c0b81cd5ea23e9205d9813a7fa1e150353"} Dec 01 16:01:01 crc kubenswrapper[4931]: I1201 16:01:01.681906 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29410081-j4xz2" podStartSLOduration=1.681883405 podStartE2EDuration="1.681883405s" podCreationTimestamp="2025-12-01 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 16:01:01.680420806 +0000 UTC m=+3608.106294513" watchObservedRunningTime="2025-12-01 16:01:01.681883405 +0000 UTC m=+3608.107757112" Dec 01 16:01:03 crc 
kubenswrapper[4931]: I1201 16:01:03.686118 4931 generic.go:334] "Generic (PLEG): container finished" podID="736a9048-9855-44b1-aae0-9da840848c45" containerID="6cc6c165651f910bb89c160e03d307227f4e363f2f811276c7be0c21ffff1eb1" exitCode=0 Dec 01 16:01:03 crc kubenswrapper[4931]: I1201 16:01:03.686323 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410081-j4xz2" event={"ID":"736a9048-9855-44b1-aae0-9da840848c45","Type":"ContainerDied","Data":"6cc6c165651f910bb89c160e03d307227f4e363f2f811276c7be0c21ffff1eb1"} Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.096690 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.290746 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-config-data\") pod \"736a9048-9855-44b1-aae0-9da840848c45\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.290910 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vqgr\" (UniqueName: \"kubernetes.io/projected/736a9048-9855-44b1-aae0-9da840848c45-kube-api-access-7vqgr\") pod \"736a9048-9855-44b1-aae0-9da840848c45\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.290941 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-combined-ca-bundle\") pod \"736a9048-9855-44b1-aae0-9da840848c45\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.291039 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-fernet-keys\") pod \"736a9048-9855-44b1-aae0-9da840848c45\" (UID: \"736a9048-9855-44b1-aae0-9da840848c45\") " Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.304488 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "736a9048-9855-44b1-aae0-9da840848c45" (UID: "736a9048-9855-44b1-aae0-9da840848c45"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.315747 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736a9048-9855-44b1-aae0-9da840848c45-kube-api-access-7vqgr" (OuterVolumeSpecName: "kube-api-access-7vqgr") pod "736a9048-9855-44b1-aae0-9da840848c45" (UID: "736a9048-9855-44b1-aae0-9da840848c45"). InnerVolumeSpecName "kube-api-access-7vqgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.346012 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "736a9048-9855-44b1-aae0-9da840848c45" (UID: "736a9048-9855-44b1-aae0-9da840848c45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.360152 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-config-data" (OuterVolumeSpecName: "config-data") pod "736a9048-9855-44b1-aae0-9da840848c45" (UID: "736a9048-9855-44b1-aae0-9da840848c45"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.393092 4931 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.393134 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vqgr\" (UniqueName: \"kubernetes.io/projected/736a9048-9855-44b1-aae0-9da840848c45-kube-api-access-7vqgr\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.393150 4931 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.393162 4931 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/736a9048-9855-44b1-aae0-9da840848c45-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.703528 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410081-j4xz2" event={"ID":"736a9048-9855-44b1-aae0-9da840848c45","Type":"ContainerDied","Data":"eb8e1d393ac44b18263cd0eff87548c0b81cd5ea23e9205d9813a7fa1e150353"} Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.703862 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb8e1d393ac44b18263cd0eff87548c0b81cd5ea23e9205d9813a7fa1e150353" Dec 01 16:01:05 crc kubenswrapper[4931]: I1201 16:01:05.703621 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410081-j4xz2" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.527046 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5cmwr"] Dec 01 16:01:10 crc kubenswrapper[4931]: E1201 16:01:10.528987 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736a9048-9855-44b1-aae0-9da840848c45" containerName="keystone-cron" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.529066 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="736a9048-9855-44b1-aae0-9da840848c45" containerName="keystone-cron" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.529365 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="736a9048-9855-44b1-aae0-9da840848c45" containerName="keystone-cron" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.531173 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.555808 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cmwr"] Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.620899 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4jl7\" (UniqueName: \"kubernetes.io/projected/b4240085-00ab-4edf-b2ca-ae04ea170973-kube-api-access-d4jl7\") pod \"certified-operators-5cmwr\" (UID: \"b4240085-00ab-4edf-b2ca-ae04ea170973\") " pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.620998 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4240085-00ab-4edf-b2ca-ae04ea170973-utilities\") pod \"certified-operators-5cmwr\" (UID: \"b4240085-00ab-4edf-b2ca-ae04ea170973\") " 
pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.621060 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4240085-00ab-4edf-b2ca-ae04ea170973-catalog-content\") pod \"certified-operators-5cmwr\" (UID: \"b4240085-00ab-4edf-b2ca-ae04ea170973\") " pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.723466 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4240085-00ab-4edf-b2ca-ae04ea170973-utilities\") pod \"certified-operators-5cmwr\" (UID: \"b4240085-00ab-4edf-b2ca-ae04ea170973\") " pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.723545 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4240085-00ab-4edf-b2ca-ae04ea170973-catalog-content\") pod \"certified-operators-5cmwr\" (UID: \"b4240085-00ab-4edf-b2ca-ae04ea170973\") " pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.723640 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4jl7\" (UniqueName: \"kubernetes.io/projected/b4240085-00ab-4edf-b2ca-ae04ea170973-kube-api-access-d4jl7\") pod \"certified-operators-5cmwr\" (UID: \"b4240085-00ab-4edf-b2ca-ae04ea170973\") " pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.724212 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4240085-00ab-4edf-b2ca-ae04ea170973-utilities\") pod \"certified-operators-5cmwr\" (UID: \"b4240085-00ab-4edf-b2ca-ae04ea170973\") " 
pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.724237 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4240085-00ab-4edf-b2ca-ae04ea170973-catalog-content\") pod \"certified-operators-5cmwr\" (UID: \"b4240085-00ab-4edf-b2ca-ae04ea170973\") " pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:10 crc kubenswrapper[4931]: I1201 16:01:10.750612 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4jl7\" (UniqueName: \"kubernetes.io/projected/b4240085-00ab-4edf-b2ca-ae04ea170973-kube-api-access-d4jl7\") pod \"certified-operators-5cmwr\" (UID: \"b4240085-00ab-4edf-b2ca-ae04ea170973\") " pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:11 crc kubenswrapper[4931]: I1201 16:01:11.849631 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:12 crc kubenswrapper[4931]: I1201 16:01:12.340637 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cmwr"] Dec 01 16:01:12 crc kubenswrapper[4931]: E1201 16:01:12.715101 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4240085_00ab_4edf_b2ca_ae04ea170973.slice/crio-254c7df0cba54d3767c5f196ed31bcbc1dda4721b643d31257430dafcba9997e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4240085_00ab_4edf_b2ca_ae04ea170973.slice/crio-conmon-254c7df0cba54d3767c5f196ed31bcbc1dda4721b643d31257430dafcba9997e.scope\": RecentStats: unable to find data in memory cache]" Dec 01 16:01:12 crc kubenswrapper[4931]: I1201 16:01:12.761005 4931 generic.go:334] "Generic (PLEG): container finished" 
podID="b4240085-00ab-4edf-b2ca-ae04ea170973" containerID="254c7df0cba54d3767c5f196ed31bcbc1dda4721b643d31257430dafcba9997e" exitCode=0 Dec 01 16:01:12 crc kubenswrapper[4931]: I1201 16:01:12.761050 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cmwr" event={"ID":"b4240085-00ab-4edf-b2ca-ae04ea170973","Type":"ContainerDied","Data":"254c7df0cba54d3767c5f196ed31bcbc1dda4721b643d31257430dafcba9997e"} Dec 01 16:01:12 crc kubenswrapper[4931]: I1201 16:01:12.761079 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cmwr" event={"ID":"b4240085-00ab-4edf-b2ca-ae04ea170973","Type":"ContainerStarted","Data":"910e81bd3cf4a65a514bf55bf15c1cf6fbae6532f3bc2d809a61b352d009d832"} Dec 01 16:01:19 crc kubenswrapper[4931]: I1201 16:01:19.846285 4931 generic.go:334] "Generic (PLEG): container finished" podID="b4240085-00ab-4edf-b2ca-ae04ea170973" containerID="e17a258c9d0e0d48d6109e236366fa6f5212aa4f060c3be48f372b566bb3b4e3" exitCode=0 Dec 01 16:01:19 crc kubenswrapper[4931]: I1201 16:01:19.846463 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cmwr" event={"ID":"b4240085-00ab-4edf-b2ca-ae04ea170973","Type":"ContainerDied","Data":"e17a258c9d0e0d48d6109e236366fa6f5212aa4f060c3be48f372b566bb3b4e3"} Dec 01 16:01:19 crc kubenswrapper[4931]: I1201 16:01:19.872537 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:01:19 crc kubenswrapper[4931]: I1201 16:01:19.872612 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:01:19 crc kubenswrapper[4931]: I1201 16:01:19.872668 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 16:01:19 crc kubenswrapper[4931]: I1201 16:01:19.874198 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7a9470ce8c868229b979f2cb9e7efbcbadfca1c6e29e938ab705e714f043b38"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 16:01:19 crc kubenswrapper[4931]: I1201 16:01:19.874275 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://d7a9470ce8c868229b979f2cb9e7efbcbadfca1c6e29e938ab705e714f043b38" gracePeriod=600 Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.411597 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8v6nc"] Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.414806 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.450360 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8v6nc"] Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.540584 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-utilities\") pod \"community-operators-8v6nc\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.540725 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkhsr\" (UniqueName: \"kubernetes.io/projected/fd146ae7-76da-4156-9d84-e68c56a18fae-kube-api-access-tkhsr\") pod \"community-operators-8v6nc\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.540771 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-catalog-content\") pod \"community-operators-8v6nc\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.643575 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-catalog-content\") pod \"community-operators-8v6nc\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.643717 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-utilities\") pod \"community-operators-8v6nc\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.643825 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkhsr\" (UniqueName: \"kubernetes.io/projected/fd146ae7-76da-4156-9d84-e68c56a18fae-kube-api-access-tkhsr\") pod \"community-operators-8v6nc\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.644376 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-catalog-content\") pod \"community-operators-8v6nc\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.644531 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-utilities\") pod \"community-operators-8v6nc\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.665615 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkhsr\" (UniqueName: \"kubernetes.io/projected/fd146ae7-76da-4156-9d84-e68c56a18fae-kube-api-access-tkhsr\") pod \"community-operators-8v6nc\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.750980 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.871783 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="d7a9470ce8c868229b979f2cb9e7efbcbadfca1c6e29e938ab705e714f043b38" exitCode=0 Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.872051 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"d7a9470ce8c868229b979f2cb9e7efbcbadfca1c6e29e938ab705e714f043b38"} Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.872078 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc"} Dec 01 16:01:20 crc kubenswrapper[4931]: I1201 16:01:20.872097 4931 scope.go:117] "RemoveContainer" containerID="b42e433fe5895152503ae1841599c14659bfbb3b5c3e85e02199d4617554ad46" Dec 01 16:01:21 crc kubenswrapper[4931]: I1201 16:01:21.315439 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8v6nc"] Dec 01 16:01:21 crc kubenswrapper[4931]: I1201 16:01:21.890254 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5cmwr" event={"ID":"b4240085-00ab-4edf-b2ca-ae04ea170973","Type":"ContainerStarted","Data":"d61bac25896f899a932cb09a900f3682927e2445a097571beecb21826575d82d"} Dec 01 16:01:21 crc kubenswrapper[4931]: I1201 16:01:21.893774 4931 generic.go:334] "Generic (PLEG): container finished" podID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerID="3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a" exitCode=0 Dec 01 16:01:21 crc kubenswrapper[4931]: I1201 
16:01:21.893814 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6nc" event={"ID":"fd146ae7-76da-4156-9d84-e68c56a18fae","Type":"ContainerDied","Data":"3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a"} Dec 01 16:01:21 crc kubenswrapper[4931]: I1201 16:01:21.893839 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6nc" event={"ID":"fd146ae7-76da-4156-9d84-e68c56a18fae","Type":"ContainerStarted","Data":"9965e7da6a364e1b9da4c869791249d8045aae28342cf674478f82b4a5fd3f8f"} Dec 01 16:01:21 crc kubenswrapper[4931]: I1201 16:01:21.920959 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5cmwr" podStartSLOduration=3.6723091009999997 podStartE2EDuration="11.920941446s" podCreationTimestamp="2025-12-01 16:01:10 +0000 UTC" firstStartedPulling="2025-12-01 16:01:12.762930652 +0000 UTC m=+3619.188804319" lastFinishedPulling="2025-12-01 16:01:21.011562997 +0000 UTC m=+3627.437436664" observedRunningTime="2025-12-01 16:01:21.918823239 +0000 UTC m=+3628.344696906" watchObservedRunningTime="2025-12-01 16:01:21.920941446 +0000 UTC m=+3628.346815113" Dec 01 16:01:23 crc kubenswrapper[4931]: I1201 16:01:23.913919 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6nc" event={"ID":"fd146ae7-76da-4156-9d84-e68c56a18fae","Type":"ContainerStarted","Data":"e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978"} Dec 01 16:01:24 crc kubenswrapper[4931]: I1201 16:01:24.924778 4931 generic.go:334] "Generic (PLEG): container finished" podID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerID="e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978" exitCode=0 Dec 01 16:01:24 crc kubenswrapper[4931]: I1201 16:01:24.924836 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6nc" 
event={"ID":"fd146ae7-76da-4156-9d84-e68c56a18fae","Type":"ContainerDied","Data":"e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978"} Dec 01 16:01:26 crc kubenswrapper[4931]: I1201 16:01:26.944484 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6nc" event={"ID":"fd146ae7-76da-4156-9d84-e68c56a18fae","Type":"ContainerStarted","Data":"65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793"} Dec 01 16:01:26 crc kubenswrapper[4931]: I1201 16:01:26.964339 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8v6nc" podStartSLOduration=2.928623509 podStartE2EDuration="6.964320858s" podCreationTimestamp="2025-12-01 16:01:20 +0000 UTC" firstStartedPulling="2025-12-01 16:01:21.895784017 +0000 UTC m=+3628.321657684" lastFinishedPulling="2025-12-01 16:01:25.931481356 +0000 UTC m=+3632.357355033" observedRunningTime="2025-12-01 16:01:26.96068825 +0000 UTC m=+3633.386561917" watchObservedRunningTime="2025-12-01 16:01:26.964320858 +0000 UTC m=+3633.390194525" Dec 01 16:01:30 crc kubenswrapper[4931]: I1201 16:01:30.751697 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:30 crc kubenswrapper[4931]: I1201 16:01:30.752550 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:30 crc kubenswrapper[4931]: I1201 16:01:30.803997 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:31 crc kubenswrapper[4931]: I1201 16:01:31.851346 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:31 crc kubenswrapper[4931]: I1201 16:01:31.851778 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:31 crc kubenswrapper[4931]: I1201 16:01:31.910173 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.040989 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5cmwr" Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.101460 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5cmwr"] Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.166112 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4xwh"] Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.166427 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4xwh" podUID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerName="registry-server" containerID="cri-o://690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12" gracePeriod=2 Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.664964 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.793910 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-catalog-content\") pod \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.794020 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-utilities\") pod \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.794059 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgngv\" (UniqueName: \"kubernetes.io/projected/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-kube-api-access-wgngv\") pod \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\" (UID: \"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda\") " Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.796341 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-utilities" (OuterVolumeSpecName: "utilities") pod "d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" (UID: "d7059bfc-3946-467c-9a1c-e1b6e0ddbfda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.816556 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-kube-api-access-wgngv" (OuterVolumeSpecName: "kube-api-access-wgngv") pod "d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" (UID: "d7059bfc-3946-467c-9a1c-e1b6e0ddbfda"). InnerVolumeSpecName "kube-api-access-wgngv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.874919 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" (UID: "d7059bfc-3946-467c-9a1c-e1b6e0ddbfda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.895977 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.896021 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:32 crc kubenswrapper[4931]: I1201 16:01:32.896032 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgngv\" (UniqueName: \"kubernetes.io/projected/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda-kube-api-access-wgngv\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.035871 4931 generic.go:334] "Generic (PLEG): container finished" podID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerID="690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12" exitCode=0 Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.036235 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p4xwh" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.036598 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4xwh" event={"ID":"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda","Type":"ContainerDied","Data":"690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12"} Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.036695 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4xwh" event={"ID":"d7059bfc-3946-467c-9a1c-e1b6e0ddbfda","Type":"ContainerDied","Data":"2dca7df5c5b38ae303f0926543eb39034cd641abfe6434e53c0d5a9b3a3abd2a"} Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.036721 4931 scope.go:117] "RemoveContainer" containerID="690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.074552 4931 scope.go:117] "RemoveContainer" containerID="c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.117615 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4xwh"] Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.119786 4931 scope.go:117] "RemoveContainer" containerID="b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.129878 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4xwh"] Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.160690 4931 scope.go:117] "RemoveContainer" containerID="690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12" Dec 01 16:01:33 crc kubenswrapper[4931]: E1201 16:01:33.161215 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12\": container with ID starting with 690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12 not found: ID does not exist" containerID="690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.161248 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12"} err="failed to get container status \"690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12\": rpc error: code = NotFound desc = could not find container \"690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12\": container with ID starting with 690b014f49b18347548e87eacd9c0395fdaa5dcdae096626430a44f2ec7d0c12 not found: ID does not exist" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.161270 4931 scope.go:117] "RemoveContainer" containerID="c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655" Dec 01 16:01:33 crc kubenswrapper[4931]: E1201 16:01:33.161631 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655\": container with ID starting with c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655 not found: ID does not exist" containerID="c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.161661 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655"} err="failed to get container status \"c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655\": rpc error: code = NotFound desc = could not find container \"c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655\": container with ID 
starting with c24a3461e9a8edd0d2941e2e5eb12fb04824588b23c0a55122eb8c07b60b3655 not found: ID does not exist" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.161679 4931 scope.go:117] "RemoveContainer" containerID="b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22" Dec 01 16:01:33 crc kubenswrapper[4931]: E1201 16:01:33.162369 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22\": container with ID starting with b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22 not found: ID does not exist" containerID="b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22" Dec 01 16:01:33 crc kubenswrapper[4931]: I1201 16:01:33.162412 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22"} err="failed to get container status \"b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22\": rpc error: code = NotFound desc = could not find container \"b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22\": container with ID starting with b429639f21010e4ae767c8c99c14d6d08361acf67db37745e50b612177af2c22 not found: ID does not exist" Dec 01 16:01:33 crc kubenswrapper[4931]: E1201 16:01:33.290851 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7059bfc_3946_467c_9a1c_e1b6e0ddbfda.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7059bfc_3946_467c_9a1c_e1b6e0ddbfda.slice/crio-2dca7df5c5b38ae303f0926543eb39034cd641abfe6434e53c0d5a9b3a3abd2a\": RecentStats: unable to find data in memory cache]" Dec 01 16:01:34 crc kubenswrapper[4931]: I1201 16:01:34.258797 4931 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" path="/var/lib/kubelet/pods/d7059bfc-3946-467c-9a1c-e1b6e0ddbfda/volumes" Dec 01 16:01:36 crc kubenswrapper[4931]: I1201 16:01:36.072663 4931 generic.go:334] "Generic (PLEG): container finished" podID="29d401e0-9a64-4f38-9ae1-00a8c55d9802" containerID="bbc15a9d22ab98bdf1fea6046732f7eb8f2e3516da125c14f3f93f03a4cdc30e" exitCode=0 Dec 01 16:01:36 crc kubenswrapper[4931]: I1201 16:01:36.072846 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/crc-debug-d8wt9" event={"ID":"29d401e0-9a64-4f38-9ae1-00a8c55d9802","Type":"ContainerDied","Data":"bbc15a9d22ab98bdf1fea6046732f7eb8f2e3516da125c14f3f93f03a4cdc30e"} Dec 01 16:01:37 crc kubenswrapper[4931]: I1201 16:01:37.206836 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:01:37 crc kubenswrapper[4931]: I1201 16:01:37.256573 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p88bd/crc-debug-d8wt9"] Dec 01 16:01:37 crc kubenswrapper[4931]: I1201 16:01:37.269313 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p88bd/crc-debug-d8wt9"] Dec 01 16:01:37 crc kubenswrapper[4931]: I1201 16:01:37.295680 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pj6q\" (UniqueName: \"kubernetes.io/projected/29d401e0-9a64-4f38-9ae1-00a8c55d9802-kube-api-access-7pj6q\") pod \"29d401e0-9a64-4f38-9ae1-00a8c55d9802\" (UID: \"29d401e0-9a64-4f38-9ae1-00a8c55d9802\") " Dec 01 16:01:37 crc kubenswrapper[4931]: I1201 16:01:37.296438 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d401e0-9a64-4f38-9ae1-00a8c55d9802-host\") pod \"29d401e0-9a64-4f38-9ae1-00a8c55d9802\" (UID: \"29d401e0-9a64-4f38-9ae1-00a8c55d9802\") " Dec 01 
16:01:37 crc kubenswrapper[4931]: I1201 16:01:37.296543 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29d401e0-9a64-4f38-9ae1-00a8c55d9802-host" (OuterVolumeSpecName: "host") pod "29d401e0-9a64-4f38-9ae1-00a8c55d9802" (UID: "29d401e0-9a64-4f38-9ae1-00a8c55d9802"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 16:01:37 crc kubenswrapper[4931]: I1201 16:01:37.297560 4931 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d401e0-9a64-4f38-9ae1-00a8c55d9802-host\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:37 crc kubenswrapper[4931]: I1201 16:01:37.301737 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d401e0-9a64-4f38-9ae1-00a8c55d9802-kube-api-access-7pj6q" (OuterVolumeSpecName: "kube-api-access-7pj6q") pod "29d401e0-9a64-4f38-9ae1-00a8c55d9802" (UID: "29d401e0-9a64-4f38-9ae1-00a8c55d9802"). InnerVolumeSpecName "kube-api-access-7pj6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:01:37 crc kubenswrapper[4931]: I1201 16:01:37.399374 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pj6q\" (UniqueName: \"kubernetes.io/projected/29d401e0-9a64-4f38-9ae1-00a8c55d9802-kube-api-access-7pj6q\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.096476 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79b85bce7d9a926b958c762c05906d565f5bdbf110584fe32fe6e77b04ff6ed1" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.096560 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-d8wt9" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.257044 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d401e0-9a64-4f38-9ae1-00a8c55d9802" path="/var/lib/kubelet/pods/29d401e0-9a64-4f38-9ae1-00a8c55d9802/volumes" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.431802 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p88bd/crc-debug-qjwwb"] Dec 01 16:01:38 crc kubenswrapper[4931]: E1201 16:01:38.432488 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerName="extract-content" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.432580 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerName="extract-content" Dec 01 16:01:38 crc kubenswrapper[4931]: E1201 16:01:38.432693 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerName="extract-utilities" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.432767 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerName="extract-utilities" Dec 01 16:01:38 crc kubenswrapper[4931]: E1201 16:01:38.432856 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d401e0-9a64-4f38-9ae1-00a8c55d9802" containerName="container-00" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.432928 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d401e0-9a64-4f38-9ae1-00a8c55d9802" containerName="container-00" Dec 01 16:01:38 crc kubenswrapper[4931]: E1201 16:01:38.433008 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerName="registry-server" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.433076 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerName="registry-server" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.433366 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7059bfc-3946-467c-9a1c-e1b6e0ddbfda" containerName="registry-server" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.433486 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d401e0-9a64-4f38-9ae1-00a8c55d9802" containerName="container-00" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.434270 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.521942 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-host\") pod \"crc-debug-qjwwb\" (UID: \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\") " pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.522036 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h75r\" (UniqueName: \"kubernetes.io/projected/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-kube-api-access-7h75r\") pod \"crc-debug-qjwwb\" (UID: \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\") " pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.623322 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h75r\" (UniqueName: \"kubernetes.io/projected/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-kube-api-access-7h75r\") pod \"crc-debug-qjwwb\" (UID: \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\") " pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.623602 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-host\") pod \"crc-debug-qjwwb\" (UID: \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\") " pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.623734 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-host\") pod \"crc-debug-qjwwb\" (UID: \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\") " pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.640561 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h75r\" (UniqueName: \"kubernetes.io/projected/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-kube-api-access-7h75r\") pod \"crc-debug-qjwwb\" (UID: \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\") " pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:38 crc kubenswrapper[4931]: I1201 16:01:38.770354 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:38 crc kubenswrapper[4931]: W1201 16:01:38.807861 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcab9801_ab8a_40fd_aa3a_0f68fd056ab2.slice/crio-38393467ce2aea3e434f4f17afdf46e60a16c0bef243d13da2bdb1d3d9c1fe56 WatchSource:0}: Error finding container 38393467ce2aea3e434f4f17afdf46e60a16c0bef243d13da2bdb1d3d9c1fe56: Status 404 returned error can't find the container with id 38393467ce2aea3e434f4f17afdf46e60a16c0bef243d13da2bdb1d3d9c1fe56 Dec 01 16:01:39 crc kubenswrapper[4931]: I1201 16:01:39.109669 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/crc-debug-qjwwb" event={"ID":"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2","Type":"ContainerStarted","Data":"55a4d357a24d3405fd1d1e627a9990193564bdc847b32e93d5db6777b95a7e73"} Dec 01 16:01:39 crc kubenswrapper[4931]: I1201 16:01:39.109748 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/crc-debug-qjwwb" event={"ID":"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2","Type":"ContainerStarted","Data":"38393467ce2aea3e434f4f17afdf46e60a16c0bef243d13da2bdb1d3d9c1fe56"} Dec 01 16:01:39 crc kubenswrapper[4931]: I1201 16:01:39.141758 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p88bd/crc-debug-qjwwb" podStartSLOduration=1.141734623 podStartE2EDuration="1.141734623s" podCreationTimestamp="2025-12-01 16:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 16:01:39.130662774 +0000 UTC m=+3645.556536461" watchObservedRunningTime="2025-12-01 16:01:39.141734623 +0000 UTC m=+3645.567608300" Dec 01 16:01:40 crc kubenswrapper[4931]: I1201 16:01:40.122634 4931 generic.go:334] "Generic (PLEG): container finished" podID="dcab9801-ab8a-40fd-aa3a-0f68fd056ab2" 
containerID="55a4d357a24d3405fd1d1e627a9990193564bdc847b32e93d5db6777b95a7e73" exitCode=0 Dec 01 16:01:40 crc kubenswrapper[4931]: I1201 16:01:40.122725 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/crc-debug-qjwwb" event={"ID":"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2","Type":"ContainerDied","Data":"55a4d357a24d3405fd1d1e627a9990193564bdc847b32e93d5db6777b95a7e73"} Dec 01 16:01:40 crc kubenswrapper[4931]: I1201 16:01:40.829567 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:40 crc kubenswrapper[4931]: I1201 16:01:40.915151 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8v6nc"] Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.132934 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8v6nc" podUID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerName="registry-server" containerID="cri-o://65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793" gracePeriod=2 Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.336227 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.373609 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p88bd/crc-debug-qjwwb"] Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.388026 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p88bd/crc-debug-qjwwb"] Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.429782 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-host\") pod \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\" (UID: \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\") " Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.429880 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h75r\" (UniqueName: \"kubernetes.io/projected/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-kube-api-access-7h75r\") pod \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\" (UID: \"dcab9801-ab8a-40fd-aa3a-0f68fd056ab2\") " Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.430417 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-host" (OuterVolumeSpecName: "host") pod "dcab9801-ab8a-40fd-aa3a-0f68fd056ab2" (UID: "dcab9801-ab8a-40fd-aa3a-0f68fd056ab2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.433257 4931 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-host\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.437123 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-kube-api-access-7h75r" (OuterVolumeSpecName: "kube-api-access-7h75r") pod "dcab9801-ab8a-40fd-aa3a-0f68fd056ab2" (UID: "dcab9801-ab8a-40fd-aa3a-0f68fd056ab2"). InnerVolumeSpecName "kube-api-access-7h75r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.534902 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h75r\" (UniqueName: \"kubernetes.io/projected/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2-kube-api-access-7h75r\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.665727 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.840420 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-utilities\") pod \"fd146ae7-76da-4156-9d84-e68c56a18fae\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.841139 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-catalog-content\") pod \"fd146ae7-76da-4156-9d84-e68c56a18fae\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.841244 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-utilities" (OuterVolumeSpecName: "utilities") pod "fd146ae7-76da-4156-9d84-e68c56a18fae" (UID: "fd146ae7-76da-4156-9d84-e68c56a18fae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.841284 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkhsr\" (UniqueName: \"kubernetes.io/projected/fd146ae7-76da-4156-9d84-e68c56a18fae-kube-api-access-tkhsr\") pod \"fd146ae7-76da-4156-9d84-e68c56a18fae\" (UID: \"fd146ae7-76da-4156-9d84-e68c56a18fae\") " Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.842691 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.845647 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd146ae7-76da-4156-9d84-e68c56a18fae-kube-api-access-tkhsr" (OuterVolumeSpecName: "kube-api-access-tkhsr") pod "fd146ae7-76da-4156-9d84-e68c56a18fae" (UID: "fd146ae7-76da-4156-9d84-e68c56a18fae"). InnerVolumeSpecName "kube-api-access-tkhsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.909829 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd146ae7-76da-4156-9d84-e68c56a18fae" (UID: "fd146ae7-76da-4156-9d84-e68c56a18fae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.944807 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd146ae7-76da-4156-9d84-e68c56a18fae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:41 crc kubenswrapper[4931]: I1201 16:01:41.944843 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkhsr\" (UniqueName: \"kubernetes.io/projected/fd146ae7-76da-4156-9d84-e68c56a18fae-kube-api-access-tkhsr\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.143877 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38393467ce2aea3e434f4f17afdf46e60a16c0bef243d13da2bdb1d3d9c1fe56" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.145355 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-qjwwb" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.146207 4931 generic.go:334] "Generic (PLEG): container finished" podID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerID="65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793" exitCode=0 Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.146235 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8v6nc" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.146257 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6nc" event={"ID":"fd146ae7-76da-4156-9d84-e68c56a18fae","Type":"ContainerDied","Data":"65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793"} Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.146290 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6nc" event={"ID":"fd146ae7-76da-4156-9d84-e68c56a18fae","Type":"ContainerDied","Data":"9965e7da6a364e1b9da4c869791249d8045aae28342cf674478f82b4a5fd3f8f"} Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.146311 4931 scope.go:117] "RemoveContainer" containerID="65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.202361 4931 scope.go:117] "RemoveContainer" containerID="e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.227345 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8v6nc"] Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.242923 4931 scope.go:117] "RemoveContainer" containerID="3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.254379 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcab9801-ab8a-40fd-aa3a-0f68fd056ab2" path="/var/lib/kubelet/pods/dcab9801-ab8a-40fd-aa3a-0f68fd056ab2/volumes" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.255194 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8v6nc"] Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.277005 4931 scope.go:117] "RemoveContainer" 
containerID="65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793" Dec 01 16:01:42 crc kubenswrapper[4931]: E1201 16:01:42.277434 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793\": container with ID starting with 65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793 not found: ID does not exist" containerID="65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.277480 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793"} err="failed to get container status \"65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793\": rpc error: code = NotFound desc = could not find container \"65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793\": container with ID starting with 65daff6e4b37108e67817087c932d0f2877b9bf8454dd1f2a7eaa70d8bba1793 not found: ID does not exist" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.277509 4931 scope.go:117] "RemoveContainer" containerID="e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978" Dec 01 16:01:42 crc kubenswrapper[4931]: E1201 16:01:42.277824 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978\": container with ID starting with e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978 not found: ID does not exist" containerID="e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.277864 4931 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978"} err="failed to get container status \"e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978\": rpc error: code = NotFound desc = could not find container \"e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978\": container with ID starting with e7fb52cfd0f0956569e79287e430ff2b1d04a2862e95ff8a6a71213c71c5d978 not found: ID does not exist" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.277892 4931 scope.go:117] "RemoveContainer" containerID="3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a" Dec 01 16:01:42 crc kubenswrapper[4931]: E1201 16:01:42.278154 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a\": container with ID starting with 3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a not found: ID does not exist" containerID="3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.278187 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a"} err="failed to get container status \"3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a\": rpc error: code = NotFound desc = could not find container \"3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a\": container with ID starting with 3a0f615fa97a4de643016836d464cac8bcd6a856919cc455c87168d5d0e48c2a not found: ID does not exist" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.596893 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p88bd/crc-debug-fvnxw"] Dec 01 16:01:42 crc kubenswrapper[4931]: E1201 16:01:42.597555 4931 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerName="extract-content" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.597591 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerName="extract-content" Dec 01 16:01:42 crc kubenswrapper[4931]: E1201 16:01:42.597613 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerName="extract-utilities" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.597626 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerName="extract-utilities" Dec 01 16:01:42 crc kubenswrapper[4931]: E1201 16:01:42.597662 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcab9801-ab8a-40fd-aa3a-0f68fd056ab2" containerName="container-00" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.597676 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcab9801-ab8a-40fd-aa3a-0f68fd056ab2" containerName="container-00" Dec 01 16:01:42 crc kubenswrapper[4931]: E1201 16:01:42.597708 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerName="registry-server" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.597720 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerName="registry-server" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.598101 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcab9801-ab8a-40fd-aa3a-0f68fd056ab2" containerName="container-00" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.598140 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd146ae7-76da-4156-9d84-e68c56a18fae" containerName="registry-server" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.599550 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.763973 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-host\") pod \"crc-debug-fvnxw\" (UID: \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\") " pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.764246 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2999k\" (UniqueName: \"kubernetes.io/projected/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-kube-api-access-2999k\") pod \"crc-debug-fvnxw\" (UID: \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\") " pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.866286 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-host\") pod \"crc-debug-fvnxw\" (UID: \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\") " pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.866431 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2999k\" (UniqueName: \"kubernetes.io/projected/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-kube-api-access-2999k\") pod \"crc-debug-fvnxw\" (UID: \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\") " pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.866471 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-host\") pod \"crc-debug-fvnxw\" (UID: \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\") " pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:42 crc 
kubenswrapper[4931]: I1201 16:01:42.888999 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2999k\" (UniqueName: \"kubernetes.io/projected/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-kube-api-access-2999k\") pod \"crc-debug-fvnxw\" (UID: \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\") " pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:42 crc kubenswrapper[4931]: I1201 16:01:42.925183 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:42 crc kubenswrapper[4931]: W1201 16:01:42.961476 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e0e29b_12ee_4938_9b6f_b9c2389a95fe.slice/crio-125c229ee95e147d9245fc917e1a6c2ce3fb258cd16a2c65beac5347aacca973 WatchSource:0}: Error finding container 125c229ee95e147d9245fc917e1a6c2ce3fb258cd16a2c65beac5347aacca973: Status 404 returned error can't find the container with id 125c229ee95e147d9245fc917e1a6c2ce3fb258cd16a2c65beac5347aacca973 Dec 01 16:01:43 crc kubenswrapper[4931]: I1201 16:01:43.159164 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/crc-debug-fvnxw" event={"ID":"45e0e29b-12ee-4938-9b6f-b9c2389a95fe","Type":"ContainerStarted","Data":"7ef2fef68a10addc021b804bb152778a996dce28f34da94aeeab2eea48f6ad49"} Dec 01 16:01:43 crc kubenswrapper[4931]: I1201 16:01:43.159222 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/crc-debug-fvnxw" event={"ID":"45e0e29b-12ee-4938-9b6f-b9c2389a95fe","Type":"ContainerStarted","Data":"125c229ee95e147d9245fc917e1a6c2ce3fb258cd16a2c65beac5347aacca973"} Dec 01 16:01:43 crc kubenswrapper[4931]: I1201 16:01:43.187418 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p88bd/crc-debug-fvnxw" podStartSLOduration=1.187377301 podStartE2EDuration="1.187377301s" 
podCreationTimestamp="2025-12-01 16:01:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 16:01:43.179531809 +0000 UTC m=+3649.605405496" watchObservedRunningTime="2025-12-01 16:01:43.187377301 +0000 UTC m=+3649.613250968" Dec 01 16:01:44 crc kubenswrapper[4931]: I1201 16:01:44.178209 4931 generic.go:334] "Generic (PLEG): container finished" podID="45e0e29b-12ee-4938-9b6f-b9c2389a95fe" containerID="7ef2fef68a10addc021b804bb152778a996dce28f34da94aeeab2eea48f6ad49" exitCode=0 Dec 01 16:01:44 crc kubenswrapper[4931]: I1201 16:01:44.178271 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/crc-debug-fvnxw" event={"ID":"45e0e29b-12ee-4938-9b6f-b9c2389a95fe","Type":"ContainerDied","Data":"7ef2fef68a10addc021b804bb152778a996dce28f34da94aeeab2eea48f6ad49"} Dec 01 16:01:44 crc kubenswrapper[4931]: I1201 16:01:44.261948 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd146ae7-76da-4156-9d84-e68c56a18fae" path="/var/lib/kubelet/pods/fd146ae7-76da-4156-9d84-e68c56a18fae/volumes" Dec 01 16:01:45 crc kubenswrapper[4931]: I1201 16:01:45.300516 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:45 crc kubenswrapper[4931]: I1201 16:01:45.322636 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-host\") pod \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\" (UID: \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\") " Dec 01 16:01:45 crc kubenswrapper[4931]: I1201 16:01:45.322760 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2999k\" (UniqueName: \"kubernetes.io/projected/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-kube-api-access-2999k\") pod \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\" (UID: \"45e0e29b-12ee-4938-9b6f-b9c2389a95fe\") " Dec 01 16:01:45 crc kubenswrapper[4931]: I1201 16:01:45.324371 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-host" (OuterVolumeSpecName: "host") pod "45e0e29b-12ee-4938-9b6f-b9c2389a95fe" (UID: "45e0e29b-12ee-4938-9b6f-b9c2389a95fe"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 16:01:45 crc kubenswrapper[4931]: I1201 16:01:45.335524 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p88bd/crc-debug-fvnxw"] Dec 01 16:01:45 crc kubenswrapper[4931]: I1201 16:01:45.336953 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-kube-api-access-2999k" (OuterVolumeSpecName: "kube-api-access-2999k") pod "45e0e29b-12ee-4938-9b6f-b9c2389a95fe" (UID: "45e0e29b-12ee-4938-9b6f-b9c2389a95fe"). InnerVolumeSpecName "kube-api-access-2999k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:01:45 crc kubenswrapper[4931]: I1201 16:01:45.344675 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p88bd/crc-debug-fvnxw"] Dec 01 16:01:45 crc kubenswrapper[4931]: I1201 16:01:45.424811 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2999k\" (UniqueName: \"kubernetes.io/projected/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-kube-api-access-2999k\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:45 crc kubenswrapper[4931]: I1201 16:01:45.425106 4931 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/45e0e29b-12ee-4938-9b6f-b9c2389a95fe-host\") on node \"crc\" DevicePath \"\"" Dec 01 16:01:46 crc kubenswrapper[4931]: I1201 16:01:46.196244 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="125c229ee95e147d9245fc917e1a6c2ce3fb258cd16a2c65beac5347aacca973" Dec 01 16:01:46 crc kubenswrapper[4931]: I1201 16:01:46.196286 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/crc-debug-fvnxw" Dec 01 16:01:46 crc kubenswrapper[4931]: I1201 16:01:46.260464 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e0e29b-12ee-4938-9b6f-b9c2389a95fe" path="/var/lib/kubelet/pods/45e0e29b-12ee-4938-9b6f-b9c2389a95fe/volumes" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.020855 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-779b959886-w78q9_93a16952-5c17-428c-b560-4a661b8b5416/barbican-api/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.138128 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-779b959886-w78q9_93a16952-5c17-428c-b560-4a661b8b5416/barbican-api-log/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.178813 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6765f56b8d-pn88t_05a83efe-a056-4531-b9f3-c6c4f87a9cdb/barbican-keystone-listener/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.218778 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6765f56b8d-pn88t_05a83efe-a056-4531-b9f3-c6c4f87a9cdb/barbican-keystone-listener-log/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.383298 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d6fd8967-rrlqd_4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5/barbican-worker/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.416887 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d6fd8967-rrlqd_4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5/barbican-worker-log/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.551238 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn_b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.615449 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_de8e3974-4d05-40e2-9306-01dd34663e53/ceilometer-central-agent/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.664371 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_de8e3974-4d05-40e2-9306-01dd34663e53/ceilometer-notification-agent/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.766248 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_de8e3974-4d05-40e2-9306-01dd34663e53/sg-core/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.770664 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_de8e3974-4d05-40e2-9306-01dd34663e53/proxy-httpd/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.864874 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5d91c4a2-739a-4533-a21b-9aa362069d32/cinder-api/0.log" Dec 01 16:02:00 crc kubenswrapper[4931]: I1201 16:02:00.934128 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5d91c4a2-739a-4533-a21b-9aa362069d32/cinder-api-log/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.079978 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2cae1c45-3e2d-4df6-93a6-b133953bdce0/probe/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.086590 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2cae1c45-3e2d-4df6-93a6-b133953bdce0/cinder-scheduler/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.197602 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-65xbv_ff92bfe2-2afc-4cc2-9317-db96b912117c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.303530 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4_ae337b8b-ad01-493f-9471-aec15d221507/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.417494 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-w6hcc_e71b5ae6-3b87-43c4-839b-350df6114a20/init/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.587030 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-w6hcc_e71b5ae6-3b87-43c4-839b-350df6114a20/dnsmasq-dns/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.600884 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-w6hcc_e71b5ae6-3b87-43c4-839b-350df6114a20/init/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.634342 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d_3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.811071 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2316f8ea-3789-4702-91c2-a44da618bb8d/glance-httpd/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.841979 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2316f8ea-3789-4702-91c2-a44da618bb8d/glance-log/0.log" Dec 01 16:02:01 crc kubenswrapper[4931]: I1201 16:02:01.998148 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_ab67b9e9-4315-4390-b414-89b215ad823b/glance-httpd/0.log" Dec 01 16:02:02 crc kubenswrapper[4931]: I1201 16:02:02.039277 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ab67b9e9-4315-4390-b414-89b215ad823b/glance-log/0.log" Dec 01 16:02:02 crc kubenswrapper[4931]: I1201 16:02:02.138727 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-65c944c654-l6mmj_1a2f9f3b-603b-4004-8e6f-dce5b810785c/horizon/0.log" Dec 01 16:02:02 crc kubenswrapper[4931]: I1201 16:02:02.402975 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-z54c5_624618ca-ac0b-4fcf-bcf8-a7e744c98241/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:02 crc kubenswrapper[4931]: I1201 16:02:02.497587 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-65c944c654-l6mmj_1a2f9f3b-603b-4004-8e6f-dce5b810785c/horizon-log/0.log" Dec 01 16:02:02 crc kubenswrapper[4931]: I1201 16:02:02.529061 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-476jq_c09d1f20-7083-4a46-bb55-734481a5d66c/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:02 crc kubenswrapper[4931]: I1201 16:02:02.758537 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29410081-j4xz2_736a9048-9855-44b1-aae0-9da840848c45/keystone-cron/0.log" Dec 01 16:02:02 crc kubenswrapper[4931]: I1201 16:02:02.797358 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7964d85c7c-w2fwr_407f9337-3fab-42cc-b10d-eada296e7919/keystone-api/0.log" Dec 01 16:02:02 crc kubenswrapper[4931]: I1201 16:02:02.897365 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7/kube-state-metrics/0.log" Dec 01 16:02:03 crc kubenswrapper[4931]: I1201 16:02:03.124606 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vrghr_c016013e-7315-4763-9b4d-0876e4c2068f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:03 crc kubenswrapper[4931]: I1201 16:02:03.433835 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f86b4c4b5-zs55r_38afe8fe-5f17-4be2-84b6-da4211785ed1/neutron-httpd/0.log" Dec 01 16:02:03 crc kubenswrapper[4931]: I1201 16:02:03.537074 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f86b4c4b5-zs55r_38afe8fe-5f17-4be2-84b6-da4211785ed1/neutron-api/0.log" Dec 01 16:02:03 crc kubenswrapper[4931]: I1201 16:02:03.649828 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv_6225489b-6b7e-40c4-9f3e-e2e28b74d274/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:04 crc kubenswrapper[4931]: I1201 16:02:04.088454 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c084c37d-132d-466d-94d5-8176928d467e/nova-cell0-conductor-conductor/0.log" Dec 01 16:02:04 crc kubenswrapper[4931]: I1201 16:02:04.090838 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2410dbb5-8347-4484-b4dc-6ed9dd32edf7/nova-api-log/0.log" Dec 01 16:02:04 crc kubenswrapper[4931]: I1201 16:02:04.288021 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2410dbb5-8347-4484-b4dc-6ed9dd32edf7/nova-api-api/0.log" Dec 01 16:02:04 crc kubenswrapper[4931]: I1201 16:02:04.395188 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_dfb9b8d1-45eb-45c7-af4e-64c6c020860d/nova-cell1-conductor-conductor/0.log" 
Dec 01 16:02:04 crc kubenswrapper[4931]: I1201 16:02:04.398469 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a5e7833d-55a2-4896-9ca7-610c68157f00/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 16:02:04 crc kubenswrapper[4931]: I1201 16:02:04.591583 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qbgns_90baac5e-c041-4bd1-bba8-b11e708370e7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:04 crc kubenswrapper[4931]: I1201 16:02:04.706076 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db/nova-metadata-log/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.026718 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cf2acf20-9353-4a11-a09c-3d455a247303/mysql-bootstrap/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.030679 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_951e2313-b009-42b4-8d52-20a4e3ad6dbf/nova-scheduler-scheduler/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.179352 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cf2acf20-9353-4a11-a09c-3d455a247303/mysql-bootstrap/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.211111 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cf2acf20-9353-4a11-a09c-3d455a247303/galera/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.361110 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa/mysql-bootstrap/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.523782 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa/mysql-bootstrap/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.546890 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa/galera/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.845999 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0ca8ee92-e191-4e9b-aa91-af27342a9fb5/openstackclient/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.886145 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xhksj_0476eb16-ff0d-476b-b6da-dd437e123f26/openstack-network-exporter/0.log" Dec 01 16:02:05 crc kubenswrapper[4931]: I1201 16:02:05.943507 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db/nova-metadata-metadata/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.111237 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cgg9p_a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24/ovsdb-server-init/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.262222 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cgg9p_a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24/ovsdb-server-init/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.301889 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cgg9p_a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24/ovsdb-server/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.304942 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cgg9p_a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24/ovs-vswitchd/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.452514 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-v8h85_6f943374-baa7-4200-93ff-6773c58b032d/ovn-controller/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.514374 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-btzw5_249f41eb-85d7-46f2-80d3-5f1ea0dcbda7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.661959 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47/openstack-network-exporter/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.742998 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47/ovn-northd/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.872197 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d08b55bc-d817-44c5-81f3-6581236b50c1/ovsdbserver-nb/0.log" Dec 01 16:02:06 crc kubenswrapper[4931]: I1201 16:02:06.921241 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d08b55bc-d817-44c5-81f3-6581236b50c1/openstack-network-exporter/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.043999 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f5fc34b-28c1-4e76-8de7-aa52db803802/ovsdbserver-sb/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.086539 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f5fc34b-28c1-4e76-8de7-aa52db803802/openstack-network-exporter/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.307441 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c9d84d99b-fj4vs_ae11272a-fb06-4ea3-8ab4-64a667d9cdd9/placement-api/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.365132 4931 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-6c9d84d99b-fj4vs_ae11272a-fb06-4ea3-8ab4-64a667d9cdd9/placement-log/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.409912 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b364f6e4-552e-435c-b684-d6ebbc851ef2/setup-container/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.620695 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bc2ff309-c81a-4d19-bfb0-99a4a975b70a/setup-container/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.645187 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b364f6e4-552e-435c-b684-d6ebbc851ef2/setup-container/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.661404 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b364f6e4-552e-435c-b684-d6ebbc851ef2/rabbitmq/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.870349 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bc2ff309-c81a-4d19-bfb0-99a4a975b70a/setup-container/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.922015 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4_616e2b9d-1da1-44fd-98c3-7f8cbc8d686c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:07 crc kubenswrapper[4931]: I1201 16:02:07.966914 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bc2ff309-c81a-4d19-bfb0-99a4a975b70a/rabbitmq/0.log" Dec 01 16:02:08 crc kubenswrapper[4931]: I1201 16:02:08.152633 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-wx9bv_000355db-6a0a-46ab-8e8e-1040c775de8a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:08 crc 
kubenswrapper[4931]: I1201 16:02:08.189517 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj_c44015a7-22fa-4746-8536-2f7c70888a5d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:08 crc kubenswrapper[4931]: I1201 16:02:08.345128 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kqxsm_def3b4c2-cbcc-4aae-a750-6c96cd0d8f67/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:08 crc kubenswrapper[4931]: I1201 16:02:08.424529 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-f4bm9_04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc/ssh-known-hosts-edpm-deployment/0.log" Dec 01 16:02:08 crc kubenswrapper[4931]: I1201 16:02:08.689293 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7465544595-sc668_2b1d2c6e-39e6-438c-98e8-be76bfa71050/proxy-server/0.log" Dec 01 16:02:08 crc kubenswrapper[4931]: I1201 16:02:08.749675 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7465544595-sc668_2b1d2c6e-39e6-438c-98e8-be76bfa71050/proxy-httpd/0.log" Dec 01 16:02:08 crc kubenswrapper[4931]: I1201 16:02:08.754893 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-lbvg4_ec217c21-8698-4dd3-a58d-b2626cbfbaf2/swift-ring-rebalance/0.log" Dec 01 16:02:08 crc kubenswrapper[4931]: I1201 16:02:08.895378 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/account-auditor/0.log" Dec 01 16:02:08 crc kubenswrapper[4931]: I1201 16:02:08.934342 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/account-reaper/0.log" Dec 01 16:02:08 crc kubenswrapper[4931]: I1201 16:02:08.995183 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/account-replicator/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.117644 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/container-auditor/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.134839 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/account-server/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.178010 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/container-replicator/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.259228 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/container-server/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.319996 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/container-updater/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.338440 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-auditor/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.394850 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-expirer/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.476990 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-replicator/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.521840 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-server/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.561428 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-updater/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.619412 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/rsync/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.686649 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/swift-recon-cron/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.833265 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2xthl_d89048c3-b9a3-4274-8d12-9543d8a29503/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:09 crc kubenswrapper[4931]: I1201 16:02:09.925022 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e384c534-76cd-4296-9318-aaf007e87661/tempest-tests-tempest-tests-runner/0.log" Dec 01 16:02:10 crc kubenswrapper[4931]: I1201 16:02:10.028012 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b0a9c963-758f-4d4d-a9c0-e148a5733bf9/test-operator-logs-container/0.log" Dec 01 16:02:10 crc kubenswrapper[4931]: I1201 16:02:10.191578 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8_e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:02:19 crc kubenswrapper[4931]: I1201 16:02:19.648936 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_474458a3-5f29-4735-bed5-96f2f1d6e352/memcached/0.log" Dec 01 16:02:34 crc kubenswrapper[4931]: I1201 16:02:34.852606 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/util/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.099882 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/util/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.123675 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/pull/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.132249 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/pull/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.287170 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/pull/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.335412 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/util/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.335673 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/extract/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.528308 4931 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-wcntp_71d6d824-1cf3-4984-b8de-f10d19192a5f/kube-rbac-proxy/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.583409 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-wcntp_71d6d824-1cf3-4984-b8de-f10d19192a5f/manager/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.592772 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-lr4mz_fff7f112-af9f-42e3-beef-e0efdcf602c9/kube-rbac-proxy/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.748034 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-lr4mz_fff7f112-af9f-42e3-beef-e0efdcf602c9/manager/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.792282 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-g2jqf_8258a972-ead1-4bee-ae4f-cba90b238dde/kube-rbac-proxy/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.804479 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-g2jqf_8258a972-ead1-4bee-ae4f-cba90b238dde/manager/0.log" Dec 01 16:02:35 crc kubenswrapper[4931]: I1201 16:02:35.960581 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-g8vfm_006dbf05-46c7-4348-a9bb-74a7c56fd3fd/kube-rbac-proxy/0.log" Dec 01 16:02:36 crc kubenswrapper[4931]: I1201 16:02:36.109700 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-g8vfm_006dbf05-46c7-4348-a9bb-74a7c56fd3fd/manager/0.log" Dec 01 16:02:36 crc 
kubenswrapper[4931]: I1201 16:02:36.155304 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-2fp96_921c2e7b-0f37-4e93-ab2e-76a23e146d28/kube-rbac-proxy/0.log" Dec 01 16:02:36 crc kubenswrapper[4931]: I1201 16:02:36.185122 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-2fp96_921c2e7b-0f37-4e93-ab2e-76a23e146d28/manager/0.log" Dec 01 16:02:36 crc kubenswrapper[4931]: I1201 16:02:36.520001 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6x5bb_6b36c7c7-1886-476e-b0d8-50168c04ff83/kube-rbac-proxy/0.log" Dec 01 16:02:36 crc kubenswrapper[4931]: I1201 16:02:36.531045 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6x5bb_6b36c7c7-1886-476e-b0d8-50168c04ff83/manager/0.log" Dec 01 16:02:36 crc kubenswrapper[4931]: I1201 16:02:36.612115 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-sk9bq_371eef0f-aae7-40bd-9c47-1ffd0e77e08d/kube-rbac-proxy/0.log" Dec 01 16:02:36 crc kubenswrapper[4931]: I1201 16:02:36.822316 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-gzpgd_e20c64cb-88d5-4ffe-bb88-8715010ccf33/kube-rbac-proxy/0.log" Dec 01 16:02:36 crc kubenswrapper[4931]: I1201 16:02:36.869522 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-gzpgd_e20c64cb-88d5-4ffe-bb88-8715010ccf33/manager/0.log" Dec 01 16:02:36 crc kubenswrapper[4931]: I1201 16:02:36.894454 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-sk9bq_371eef0f-aae7-40bd-9c47-1ffd0e77e08d/manager/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.046630 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-l24dv_56ca76b6-8e16-4d73-9e78-f20e046738fc/kube-rbac-proxy/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.128631 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-l24dv_56ca76b6-8e16-4d73-9e78-f20e046738fc/manager/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.152165 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-bpqmb_1c0a3def-dea0-40f3-8368-a36f6030f7f7/kube-rbac-proxy/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.278832 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-bpqmb_1c0a3def-dea0-40f3-8368-a36f6030f7f7/manager/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.353026 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ms6t7_b30439fb-2d71-4c3b-97ec-5e304c1eb15e/manager/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.371853 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ms6t7_b30439fb-2d71-4c3b-97ec-5e304c1eb15e/kube-rbac-proxy/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.554117 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-jvkpz_e32d4db7-aa36-4724-a113-2f7ff2af254d/kube-rbac-proxy/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 
16:02:37.555779 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-jvkpz_e32d4db7-aa36-4724-a113-2f7ff2af254d/manager/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.685569 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-h5c6g_3dc1d698-63f8-4e4b-8e28-6e128b5b46da/kube-rbac-proxy/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.795327 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mtpcz_e45f587b-3120-4b16-9c9a-66bc5c1252aa/kube-rbac-proxy/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.850451 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-h5c6g_3dc1d698-63f8-4e4b-8e28-6e128b5b46da/manager/0.log" Dec 01 16:02:37 crc kubenswrapper[4931]: I1201 16:02:37.933573 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mtpcz_e45f587b-3120-4b16-9c9a-66bc5c1252aa/manager/0.log" Dec 01 16:02:38 crc kubenswrapper[4931]: I1201 16:02:38.021748 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc_b366d143-b330-4141-be76-b87796b94301/kube-rbac-proxy/0.log" Dec 01 16:02:38 crc kubenswrapper[4931]: I1201 16:02:38.033994 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc_b366d143-b330-4141-be76-b87796b94301/manager/0.log" Dec 01 16:02:38 crc kubenswrapper[4931]: I1201 16:02:38.480358 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-72slc_6fff8f62-0a00-45e5-9e66-bd92dff14023/registry-server/0.log" Dec 01 
16:02:38 crc kubenswrapper[4931]: I1201 16:02:38.509850 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-654ffbd64b-qsbsr_a34fc488-c895-4f50-9164-ced702fcf61d/operator/0.log" Dec 01 16:02:38 crc kubenswrapper[4931]: I1201 16:02:38.719148 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-g9s8h_31166fda-e2fe-4a4a-9717-550172ed4093/kube-rbac-proxy/0.log" Dec 01 16:02:38 crc kubenswrapper[4931]: I1201 16:02:38.764583 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-g9s8h_31166fda-e2fe-4a4a-9717-550172ed4093/manager/0.log" Dec 01 16:02:38 crc kubenswrapper[4931]: I1201 16:02:38.841697 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-n4w4t_a35a63a2-123b-45f4-99fe-4a7baece61be/kube-rbac-proxy/0.log" Dec 01 16:02:38 crc kubenswrapper[4931]: I1201 16:02:38.993832 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-n4w4t_a35a63a2-123b-45f4-99fe-4a7baece61be/manager/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.008891 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-zbd9m_cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3/operator/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.230653 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-vv2tp_12e80a83-7d9f-417b-934e-83d23085f11b/kube-rbac-proxy/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.256557 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-vv2tp_12e80a83-7d9f-417b-934e-83d23085f11b/manager/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.293126 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-95d9848f7-bjcxf_803870f9-7602-4eae-ba61-09e7aa4c63bb/manager/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.299303 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-rv87n_39f1253b-afe7-47b1-8c68-2a36d49f969b/kube-rbac-proxy/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.500619 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lsxmf_75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae/kube-rbac-proxy/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.511347 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-rv87n_39f1253b-afe7-47b1-8c68-2a36d49f969b/manager/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.520952 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lsxmf_75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae/manager/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.676090 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6kbkb_18b9fb30-a34f-42ad-9692-84c532f586d6/kube-rbac-proxy/0.log" Dec 01 16:02:39 crc kubenswrapper[4931]: I1201 16:02:39.710447 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6kbkb_18b9fb30-a34f-42ad-9692-84c532f586d6/manager/0.log" Dec 01 16:02:58 crc kubenswrapper[4931]: I1201 16:02:58.932410 4931 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nx8gh_0216ff96-a3b4-4486-91ab-f73485d18134/control-plane-machine-set-operator/0.log" Dec 01 16:02:59 crc kubenswrapper[4931]: I1201 16:02:59.087600 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mbql2_0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7/kube-rbac-proxy/0.log" Dec 01 16:02:59 crc kubenswrapper[4931]: I1201 16:02:59.095516 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mbql2_0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7/machine-api-operator/0.log" Dec 01 16:03:13 crc kubenswrapper[4931]: I1201 16:03:13.097166 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-lgwml_6057bc63-1367-4776-b9ab-2750c34f017d/cert-manager-controller/0.log" Dec 01 16:03:13 crc kubenswrapper[4931]: I1201 16:03:13.284695 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-6jglv_c0dce0ee-05c3-42a7-a599-5bff0b0416ee/cert-manager-webhook/0.log" Dec 01 16:03:13 crc kubenswrapper[4931]: I1201 16:03:13.293176 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-d8s9l_75a5d47d-0c63-48dc-b439-8b69b82c29ed/cert-manager-cainjector/0.log" Dec 01 16:03:27 crc kubenswrapper[4931]: I1201 16:03:27.030905 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-lt2nw_a5b7cbc9-6afb-414c-ac6e-569b6d9634ed/nmstate-console-plugin/0.log" Dec 01 16:03:27 crc kubenswrapper[4931]: I1201 16:03:27.251115 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8ljjr_fd514864-e79c-4cd7-9517-0b3f9fbee078/nmstate-handler/0.log" Dec 01 16:03:27 crc kubenswrapper[4931]: I1201 16:03:27.399362 4931 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4xwr7_8be2b09a-fed1-4f7f-9424-5ca00a814c3d/nmstate-metrics/0.log" Dec 01 16:03:27 crc kubenswrapper[4931]: I1201 16:03:27.418975 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4xwr7_8be2b09a-fed1-4f7f-9424-5ca00a814c3d/kube-rbac-proxy/0.log" Dec 01 16:03:27 crc kubenswrapper[4931]: I1201 16:03:27.580222 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-h5j95_915c495c-c7fa-4c00-ad7e-03d0e7ba74c9/nmstate-operator/0.log" Dec 01 16:03:27 crc kubenswrapper[4931]: I1201 16:03:27.615167 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-5xlg8_f821ad14-f34f-49cb-a884-905cf0219454/nmstate-webhook/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.158061 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-jcr8j_d143b724-552d-4530-844e-c8b752b2ffa3/kube-rbac-proxy/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.226327 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-jcr8j_d143b724-552d-4530-844e-c8b752b2ffa3/controller/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.328599 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-frr-files/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.524329 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-frr-files/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.550638 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-reloader/0.log" Dec 01 16:03:43 crc 
kubenswrapper[4931]: I1201 16:03:43.554737 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-metrics/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.560843 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-reloader/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.683163 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-frr-files/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.727451 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-reloader/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.733883 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-metrics/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.755179 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-metrics/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.908040 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-frr-files/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.914066 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-metrics/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.918187 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-reloader/0.log" Dec 01 16:03:43 crc kubenswrapper[4931]: I1201 16:03:43.941339 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/controller/0.log" Dec 01 16:03:44 crc kubenswrapper[4931]: I1201 16:03:44.069319 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/frr-metrics/0.log" Dec 01 16:03:44 crc kubenswrapper[4931]: I1201 16:03:44.099348 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/kube-rbac-proxy/0.log" Dec 01 16:03:44 crc kubenswrapper[4931]: I1201 16:03:44.139892 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/kube-rbac-proxy-frr/0.log" Dec 01 16:03:44 crc kubenswrapper[4931]: I1201 16:03:44.348517 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/reloader/0.log" Dec 01 16:03:44 crc kubenswrapper[4931]: I1201 16:03:44.351909 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-rj9kg_84cf14c6-6a50-4863-98a4-caab8b1f5636/frr-k8s-webhook-server/0.log" Dec 01 16:03:44 crc kubenswrapper[4931]: I1201 16:03:44.598198 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-549c5b5d67-wrmzh_14764be0-1d4c-47d3-9d5e-3682a7857d04/manager/0.log" Dec 01 16:03:44 crc kubenswrapper[4931]: I1201 16:03:44.720272 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-dbbf79d98-cbdx9_63d2099f-ba55-4f55-97ac-bca52404a30a/webhook-server/0.log" Dec 01 16:03:44 crc kubenswrapper[4931]: I1201 16:03:44.836519 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qb2rl_d427cc71-929e-42ae-bc96-87360ba8c005/kube-rbac-proxy/0.log" Dec 01 16:03:45 crc kubenswrapper[4931]: I1201 16:03:45.408125 4931 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qb2rl_d427cc71-929e-42ae-bc96-87360ba8c005/speaker/0.log" Dec 01 16:03:45 crc kubenswrapper[4931]: I1201 16:03:45.524602 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/frr/0.log" Dec 01 16:03:49 crc kubenswrapper[4931]: I1201 16:03:49.872174 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:03:49 crc kubenswrapper[4931]: I1201 16:03:49.872705 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.181062 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/util/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.364273 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/pull/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.378672 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/util/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.399560 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/pull/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.592798 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/util/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.600549 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/pull/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.604211 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/extract/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.760803 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/util/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.927576 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/util/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.956613 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/pull/0.log" Dec 01 16:03:58 crc kubenswrapper[4931]: I1201 16:03:58.958609 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/pull/0.log" Dec 01 
16:03:59 crc kubenswrapper[4931]: I1201 16:03:59.144307 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/pull/0.log" Dec 01 16:03:59 crc kubenswrapper[4931]: I1201 16:03:59.188378 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/util/0.log" Dec 01 16:03:59 crc kubenswrapper[4931]: I1201 16:03:59.208257 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/extract/0.log" Dec 01 16:03:59 crc kubenswrapper[4931]: I1201 16:03:59.363175 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-utilities/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.106592 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-utilities/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.119111 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-content/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.145161 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-content/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.277850 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-content/0.log" Dec 01 16:04:00 crc 
kubenswrapper[4931]: I1201 16:04:00.293158 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-utilities/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.410408 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/registry-server/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.444350 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-utilities/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.603358 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-utilities/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.641869 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-content/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.653466 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-content/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.805692 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-content/0.log" Dec 01 16:04:00 crc kubenswrapper[4931]: I1201 16:04:00.821837 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-utilities/0.log" Dec 01 16:04:01 crc kubenswrapper[4931]: I1201 16:04:01.025806 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8tfrh_d87b0f89-7ea5-4550-bace-1df5f7c508db/marketplace-operator/0.log" Dec 01 16:04:01 crc kubenswrapper[4931]: I1201 16:04:01.097140 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-utilities/0.log" Dec 01 16:04:01 crc kubenswrapper[4931]: I1201 16:04:01.541023 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/registry-server/0.log" Dec 01 16:04:01 crc kubenswrapper[4931]: I1201 16:04:01.755923 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-utilities/0.log" Dec 01 16:04:01 crc kubenswrapper[4931]: I1201 16:04:01.769547 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-content/0.log" Dec 01 16:04:01 crc kubenswrapper[4931]: I1201 16:04:01.786821 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-content/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.041968 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wxd5j_c3575355-a4fc-4f78-8512-79f2fb4bd449/extract-utilities/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.046701 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-utilities/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.047965 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-content/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.049705 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/registry-server/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.207399 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wxd5j_c3575355-a4fc-4f78-8512-79f2fb4bd449/extract-content/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.209952 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wxd5j_c3575355-a4fc-4f78-8512-79f2fb4bd449/extract-utilities/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.232884 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wxd5j_c3575355-a4fc-4f78-8512-79f2fb4bd449/extract-content/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.423594 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wxd5j_c3575355-a4fc-4f78-8512-79f2fb4bd449/extract-utilities/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.460313 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wxd5j_c3575355-a4fc-4f78-8512-79f2fb4bd449/extract-content/0.log" Dec 01 16:04:02 crc kubenswrapper[4931]: I1201 16:04:02.952865 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wxd5j_c3575355-a4fc-4f78-8512-79f2fb4bd449/registry-server/0.log" Dec 01 16:04:19 crc kubenswrapper[4931]: I1201 16:04:19.872297 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:04:19 crc kubenswrapper[4931]: I1201 16:04:19.872941 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:04:49 crc kubenswrapper[4931]: I1201 16:04:49.871969 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:04:49 crc kubenswrapper[4931]: I1201 16:04:49.872623 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:04:49 crc kubenswrapper[4931]: I1201 16:04:49.872683 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 16:04:49 crc kubenswrapper[4931]: I1201 16:04:49.873712 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 16:04:49 crc kubenswrapper[4931]: I1201 16:04:49.873803 4931 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" gracePeriod=600 Dec 01 16:04:50 crc kubenswrapper[4931]: E1201 16:04:50.021063 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:04:50 crc kubenswrapper[4931]: I1201 16:04:50.994962 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" exitCode=0 Dec 01 16:04:50 crc kubenswrapper[4931]: I1201 16:04:50.995039 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc"} Dec 01 16:04:50 crc kubenswrapper[4931]: I1201 16:04:50.995097 4931 scope.go:117] "RemoveContainer" containerID="d7a9470ce8c868229b979f2cb9e7efbcbadfca1c6e29e938ab705e714f043b38" Dec 01 16:04:50 crc kubenswrapper[4931]: I1201 16:04:50.995887 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:04:50 crc kubenswrapper[4931]: E1201 16:04:50.996186 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.530894 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n85b4"] Dec 01 16:04:58 crc kubenswrapper[4931]: E1201 16:04:58.531860 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e0e29b-12ee-4938-9b6f-b9c2389a95fe" containerName="container-00" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.531874 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e0e29b-12ee-4938-9b6f-b9c2389a95fe" containerName="container-00" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.532089 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e0e29b-12ee-4938-9b6f-b9c2389a95fe" containerName="container-00" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.533434 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.541641 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n85b4"] Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.666475 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be91a404-d2e8-463a-9aa4-34351d6c67a8-catalog-content\") pod \"redhat-operators-n85b4\" (UID: \"be91a404-d2e8-463a-9aa4-34351d6c67a8\") " pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.666551 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be91a404-d2e8-463a-9aa4-34351d6c67a8-utilities\") pod \"redhat-operators-n85b4\" (UID: \"be91a404-d2e8-463a-9aa4-34351d6c67a8\") " pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.666656 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7v78\" (UniqueName: \"kubernetes.io/projected/be91a404-d2e8-463a-9aa4-34351d6c67a8-kube-api-access-r7v78\") pod \"redhat-operators-n85b4\" (UID: \"be91a404-d2e8-463a-9aa4-34351d6c67a8\") " pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.768312 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be91a404-d2e8-463a-9aa4-34351d6c67a8-catalog-content\") pod \"redhat-operators-n85b4\" (UID: \"be91a404-d2e8-463a-9aa4-34351d6c67a8\") " pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.768368 4931 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be91a404-d2e8-463a-9aa4-34351d6c67a8-utilities\") pod \"redhat-operators-n85b4\" (UID: \"be91a404-d2e8-463a-9aa4-34351d6c67a8\") " pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.768816 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7v78\" (UniqueName: \"kubernetes.io/projected/be91a404-d2e8-463a-9aa4-34351d6c67a8-kube-api-access-r7v78\") pod \"redhat-operators-n85b4\" (UID: \"be91a404-d2e8-463a-9aa4-34351d6c67a8\") " pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.768897 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be91a404-d2e8-463a-9aa4-34351d6c67a8-catalog-content\") pod \"redhat-operators-n85b4\" (UID: \"be91a404-d2e8-463a-9aa4-34351d6c67a8\") " pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.768939 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be91a404-d2e8-463a-9aa4-34351d6c67a8-utilities\") pod \"redhat-operators-n85b4\" (UID: \"be91a404-d2e8-463a-9aa4-34351d6c67a8\") " pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.789003 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7v78\" (UniqueName: \"kubernetes.io/projected/be91a404-d2e8-463a-9aa4-34351d6c67a8-kube-api-access-r7v78\") pod \"redhat-operators-n85b4\" (UID: \"be91a404-d2e8-463a-9aa4-34351d6c67a8\") " pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:58 crc kubenswrapper[4931]: I1201 16:04:58.863836 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:04:59 crc kubenswrapper[4931]: I1201 16:04:59.319396 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n85b4"] Dec 01 16:05:00 crc kubenswrapper[4931]: I1201 16:05:00.104496 4931 generic.go:334] "Generic (PLEG): container finished" podID="be91a404-d2e8-463a-9aa4-34351d6c67a8" containerID="b5ec51792f5953381e92585688c2b52cfae70753406c4e6dcb6f0fc4df72454c" exitCode=0 Dec 01 16:05:00 crc kubenswrapper[4931]: I1201 16:05:00.104554 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n85b4" event={"ID":"be91a404-d2e8-463a-9aa4-34351d6c67a8","Type":"ContainerDied","Data":"b5ec51792f5953381e92585688c2b52cfae70753406c4e6dcb6f0fc4df72454c"} Dec 01 16:05:00 crc kubenswrapper[4931]: I1201 16:05:00.104597 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n85b4" event={"ID":"be91a404-d2e8-463a-9aa4-34351d6c67a8","Type":"ContainerStarted","Data":"e401c78925b05f02c4fa811d4894f2e3836932b3dd0a5217f299e446d021a531"} Dec 01 16:05:02 crc kubenswrapper[4931]: I1201 16:05:02.243339 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:05:02 crc kubenswrapper[4931]: E1201 16:05:02.243934 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:05:02 crc kubenswrapper[4931]: I1201 16:05:02.925792 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2f67t"] Dec 01 16:05:02 
crc kubenswrapper[4931]: I1201 16:05:02.928753 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:02 crc kubenswrapper[4931]: I1201 16:05:02.940401 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f67t"] Dec 01 16:05:02 crc kubenswrapper[4931]: I1201 16:05:02.950222 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-catalog-content\") pod \"redhat-marketplace-2f67t\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:02 crc kubenswrapper[4931]: I1201 16:05:02.950607 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-utilities\") pod \"redhat-marketplace-2f67t\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:02 crc kubenswrapper[4931]: I1201 16:05:02.950780 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbgp9\" (UniqueName: \"kubernetes.io/projected/44057f8b-7f73-43ad-807b-0dcec437eee0-kube-api-access-pbgp9\") pod \"redhat-marketplace-2f67t\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:03 crc kubenswrapper[4931]: I1201 16:05:03.053242 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbgp9\" (UniqueName: \"kubernetes.io/projected/44057f8b-7f73-43ad-807b-0dcec437eee0-kube-api-access-pbgp9\") pod \"redhat-marketplace-2f67t\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " pod="openshift-marketplace/redhat-marketplace-2f67t" 
Dec 01 16:05:03 crc kubenswrapper[4931]: I1201 16:05:03.053400 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-catalog-content\") pod \"redhat-marketplace-2f67t\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:03 crc kubenswrapper[4931]: I1201 16:05:03.053446 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-utilities\") pod \"redhat-marketplace-2f67t\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:03 crc kubenswrapper[4931]: I1201 16:05:03.054000 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-utilities\") pod \"redhat-marketplace-2f67t\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:03 crc kubenswrapper[4931]: I1201 16:05:03.054177 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-catalog-content\") pod \"redhat-marketplace-2f67t\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:03 crc kubenswrapper[4931]: I1201 16:05:03.095237 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbgp9\" (UniqueName: \"kubernetes.io/projected/44057f8b-7f73-43ad-807b-0dcec437eee0-kube-api-access-pbgp9\") pod \"redhat-marketplace-2f67t\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:03 crc kubenswrapper[4931]: I1201 
16:05:03.300879 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:03 crc kubenswrapper[4931]: I1201 16:05:03.837343 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f67t"] Dec 01 16:05:03 crc kubenswrapper[4931]: W1201 16:05:03.847711 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44057f8b_7f73_43ad_807b_0dcec437eee0.slice/crio-24bb131ea51db3b5efd2c4c4c70ff00d4a54f19df861b36a1144f1ea61d403ab WatchSource:0}: Error finding container 24bb131ea51db3b5efd2c4c4c70ff00d4a54f19df861b36a1144f1ea61d403ab: Status 404 returned error can't find the container with id 24bb131ea51db3b5efd2c4c4c70ff00d4a54f19df861b36a1144f1ea61d403ab Dec 01 16:05:04 crc kubenswrapper[4931]: I1201 16:05:04.144813 4931 generic.go:334] "Generic (PLEG): container finished" podID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerID="8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5" exitCode=0 Dec 01 16:05:04 crc kubenswrapper[4931]: I1201 16:05:04.144911 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f67t" event={"ID":"44057f8b-7f73-43ad-807b-0dcec437eee0","Type":"ContainerDied","Data":"8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5"} Dec 01 16:05:04 crc kubenswrapper[4931]: I1201 16:05:04.145199 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f67t" event={"ID":"44057f8b-7f73-43ad-807b-0dcec437eee0","Type":"ContainerStarted","Data":"24bb131ea51db3b5efd2c4c4c70ff00d4a54f19df861b36a1144f1ea61d403ab"} Dec 01 16:05:09 crc kubenswrapper[4931]: I1201 16:05:09.199989 4931 generic.go:334] "Generic (PLEG): container finished" podID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerID="0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676" 
exitCode=0 Dec 01 16:05:09 crc kubenswrapper[4931]: I1201 16:05:09.200064 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f67t" event={"ID":"44057f8b-7f73-43ad-807b-0dcec437eee0","Type":"ContainerDied","Data":"0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676"} Dec 01 16:05:09 crc kubenswrapper[4931]: I1201 16:05:09.203300 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 16:05:09 crc kubenswrapper[4931]: I1201 16:05:09.206330 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n85b4" event={"ID":"be91a404-d2e8-463a-9aa4-34351d6c67a8","Type":"ContainerStarted","Data":"91209c32f62320eccde4855beef084b966827c93886f8a7eeb6f27c5e7137049"} Dec 01 16:05:12 crc kubenswrapper[4931]: I1201 16:05:12.234019 4931 generic.go:334] "Generic (PLEG): container finished" podID="be91a404-d2e8-463a-9aa4-34351d6c67a8" containerID="91209c32f62320eccde4855beef084b966827c93886f8a7eeb6f27c5e7137049" exitCode=0 Dec 01 16:05:12 crc kubenswrapper[4931]: I1201 16:05:12.235011 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n85b4" event={"ID":"be91a404-d2e8-463a-9aa4-34351d6c67a8","Type":"ContainerDied","Data":"91209c32f62320eccde4855beef084b966827c93886f8a7eeb6f27c5e7137049"} Dec 01 16:05:15 crc kubenswrapper[4931]: I1201 16:05:15.295856 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n85b4" event={"ID":"be91a404-d2e8-463a-9aa4-34351d6c67a8","Type":"ContainerStarted","Data":"7a1e16aec0859103d69eb4cbb7300e40e50cabc826540dcd0cfee54ea8c42954"} Dec 01 16:05:15 crc kubenswrapper[4931]: I1201 16:05:15.299213 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f67t" 
event={"ID":"44057f8b-7f73-43ad-807b-0dcec437eee0","Type":"ContainerStarted","Data":"c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676"} Dec 01 16:05:15 crc kubenswrapper[4931]: I1201 16:05:15.343692 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n85b4" podStartSLOduration=2.9272443949999998 podStartE2EDuration="17.343669842s" podCreationTimestamp="2025-12-01 16:04:58 +0000 UTC" firstStartedPulling="2025-12-01 16:05:00.110061423 +0000 UTC m=+3846.535935130" lastFinishedPulling="2025-12-01 16:05:14.5264869 +0000 UTC m=+3860.952360577" observedRunningTime="2025-12-01 16:05:15.33792857 +0000 UTC m=+3861.763802237" watchObservedRunningTime="2025-12-01 16:05:15.343669842 +0000 UTC m=+3861.769543509" Dec 01 16:05:15 crc kubenswrapper[4931]: I1201 16:05:15.371475 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2f67t" podStartSLOduration=3.200716156 podStartE2EDuration="13.371457764s" podCreationTimestamp="2025-12-01 16:05:02 +0000 UTC" firstStartedPulling="2025-12-01 16:05:04.146427877 +0000 UTC m=+3850.572301544" lastFinishedPulling="2025-12-01 16:05:14.317169465 +0000 UTC m=+3860.743043152" observedRunningTime="2025-12-01 16:05:15.364097057 +0000 UTC m=+3861.789970734" watchObservedRunningTime="2025-12-01 16:05:15.371457764 +0000 UTC m=+3861.797331431" Dec 01 16:05:16 crc kubenswrapper[4931]: I1201 16:05:16.242431 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:05:16 crc kubenswrapper[4931]: E1201 16:05:16.242815 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:05:18 crc kubenswrapper[4931]: I1201 16:05:18.864370 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:05:18 crc kubenswrapper[4931]: I1201 16:05:18.865031 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:05:19 crc kubenswrapper[4931]: I1201 16:05:19.936773 4931 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n85b4" podUID="be91a404-d2e8-463a-9aa4-34351d6c67a8" containerName="registry-server" probeResult="failure" output=< Dec 01 16:05:19 crc kubenswrapper[4931]: timeout: failed to connect service ":50051" within 1s Dec 01 16:05:19 crc kubenswrapper[4931]: > Dec 01 16:05:23 crc kubenswrapper[4931]: I1201 16:05:23.302319 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:23 crc kubenswrapper[4931]: I1201 16:05:23.302722 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:23 crc kubenswrapper[4931]: I1201 16:05:23.364622 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:23 crc kubenswrapper[4931]: I1201 16:05:23.458444 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:23 crc kubenswrapper[4931]: I1201 16:05:23.629101 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f67t"] Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.411746 4931 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-2f67t" podUID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerName="registry-server" containerID="cri-o://c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676" gracePeriod=2 Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.913058 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.968679 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbgp9\" (UniqueName: \"kubernetes.io/projected/44057f8b-7f73-43ad-807b-0dcec437eee0-kube-api-access-pbgp9\") pod \"44057f8b-7f73-43ad-807b-0dcec437eee0\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.968748 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-utilities\") pod \"44057f8b-7f73-43ad-807b-0dcec437eee0\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.968798 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-catalog-content\") pod \"44057f8b-7f73-43ad-807b-0dcec437eee0\" (UID: \"44057f8b-7f73-43ad-807b-0dcec437eee0\") " Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.970361 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-utilities" (OuterVolumeSpecName: "utilities") pod "44057f8b-7f73-43ad-807b-0dcec437eee0" (UID: "44057f8b-7f73-43ad-807b-0dcec437eee0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.975469 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44057f8b-7f73-43ad-807b-0dcec437eee0-kube-api-access-pbgp9" (OuterVolumeSpecName: "kube-api-access-pbgp9") pod "44057f8b-7f73-43ad-807b-0dcec437eee0" (UID: "44057f8b-7f73-43ad-807b-0dcec437eee0"). InnerVolumeSpecName "kube-api-access-pbgp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.988104 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbgp9\" (UniqueName: \"kubernetes.io/projected/44057f8b-7f73-43ad-807b-0dcec437eee0-kube-api-access-pbgp9\") on node \"crc\" DevicePath \"\"" Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.988139 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 16:05:25 crc kubenswrapper[4931]: I1201 16:05:25.992628 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44057f8b-7f73-43ad-807b-0dcec437eee0" (UID: "44057f8b-7f73-43ad-807b-0dcec437eee0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.090331 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44057f8b-7f73-43ad-807b-0dcec437eee0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.425186 4931 generic.go:334] "Generic (PLEG): container finished" podID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerID="c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676" exitCode=0 Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.425237 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f67t" event={"ID":"44057f8b-7f73-43ad-807b-0dcec437eee0","Type":"ContainerDied","Data":"c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676"} Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.425269 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2f67t" event={"ID":"44057f8b-7f73-43ad-807b-0dcec437eee0","Type":"ContainerDied","Data":"24bb131ea51db3b5efd2c4c4c70ff00d4a54f19df861b36a1144f1ea61d403ab"} Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.425291 4931 scope.go:117] "RemoveContainer" containerID="c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.425453 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2f67t" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.455351 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f67t"] Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.456359 4931 scope.go:117] "RemoveContainer" containerID="0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.467575 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2f67t"] Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.480502 4931 scope.go:117] "RemoveContainer" containerID="8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.535976 4931 scope.go:117] "RemoveContainer" containerID="c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676" Dec 01 16:05:26 crc kubenswrapper[4931]: E1201 16:05:26.536733 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676\": container with ID starting with c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676 not found: ID does not exist" containerID="c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.536773 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676"} err="failed to get container status \"c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676\": rpc error: code = NotFound desc = could not find container \"c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676\": container with ID starting with c1788994bbd5e14a660652525af1a0501d831fc4aa53acd532b5ba323f2de676 not found: 
ID does not exist" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.536798 4931 scope.go:117] "RemoveContainer" containerID="0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676" Dec 01 16:05:26 crc kubenswrapper[4931]: E1201 16:05:26.537328 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676\": container with ID starting with 0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676 not found: ID does not exist" containerID="0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.537367 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676"} err="failed to get container status \"0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676\": rpc error: code = NotFound desc = could not find container \"0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676\": container with ID starting with 0033a8e9deb15ea65067f1df726419a1e877591cf63efdfb0c8e898094023676 not found: ID does not exist" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.537508 4931 scope.go:117] "RemoveContainer" containerID="8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5" Dec 01 16:05:26 crc kubenswrapper[4931]: E1201 16:05:26.538055 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5\": container with ID starting with 8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5 not found: ID does not exist" containerID="8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5" Dec 01 16:05:26 crc kubenswrapper[4931]: I1201 16:05:26.538100 4931 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5"} err="failed to get container status \"8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5\": rpc error: code = NotFound desc = could not find container \"8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5\": container with ID starting with 8034dbfe13d55b23c67a88da4c017d5ee1b4adc3fc36bf296aa0643bb96f03b5 not found: ID does not exist" Dec 01 16:05:28 crc kubenswrapper[4931]: I1201 16:05:28.258575 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44057f8b-7f73-43ad-807b-0dcec437eee0" path="/var/lib/kubelet/pods/44057f8b-7f73-43ad-807b-0dcec437eee0/volumes" Dec 01 16:05:28 crc kubenswrapper[4931]: I1201 16:05:28.940549 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:05:29 crc kubenswrapper[4931]: I1201 16:05:29.014877 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n85b4" Dec 01 16:05:29 crc kubenswrapper[4931]: I1201 16:05:29.857350 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n85b4"] Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.020686 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxd5j"] Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.020975 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wxd5j" podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerName="registry-server" containerID="cri-o://80346b57b3fb2801c5687bacc00f966c74cecf2ac83c8010e84b5b772a016a62" gracePeriod=2 Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.241992 4931 scope.go:117] "RemoveContainer" 
containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:05:30 crc kubenswrapper[4931]: E1201 16:05:30.242559 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.470327 4931 generic.go:334] "Generic (PLEG): container finished" podID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerID="80346b57b3fb2801c5687bacc00f966c74cecf2ac83c8010e84b5b772a016a62" exitCode=0 Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.470438 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxd5j" event={"ID":"c3575355-a4fc-4f78-8512-79f2fb4bd449","Type":"ContainerDied","Data":"80346b57b3fb2801c5687bacc00f966c74cecf2ac83c8010e84b5b772a016a62"} Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.470507 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxd5j" event={"ID":"c3575355-a4fc-4f78-8512-79f2fb4bd449","Type":"ContainerDied","Data":"b2914746783150d05d30d52fca96e1b6fb6a80f41200c1e051037cc1e87c17af"} Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.470524 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2914746783150d05d30d52fca96e1b6fb6a80f41200c1e051037cc1e87c17af" Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.482744 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.586690 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-utilities\") pod \"c3575355-a4fc-4f78-8512-79f2fb4bd449\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.586873 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvnzq\" (UniqueName: \"kubernetes.io/projected/c3575355-a4fc-4f78-8512-79f2fb4bd449-kube-api-access-gvnzq\") pod \"c3575355-a4fc-4f78-8512-79f2fb4bd449\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.586963 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-catalog-content\") pod \"c3575355-a4fc-4f78-8512-79f2fb4bd449\" (UID: \"c3575355-a4fc-4f78-8512-79f2fb4bd449\") " Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.588618 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-utilities" (OuterVolumeSpecName: "utilities") pod "c3575355-a4fc-4f78-8512-79f2fb4bd449" (UID: "c3575355-a4fc-4f78-8512-79f2fb4bd449"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.594609 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3575355-a4fc-4f78-8512-79f2fb4bd449-kube-api-access-gvnzq" (OuterVolumeSpecName: "kube-api-access-gvnzq") pod "c3575355-a4fc-4f78-8512-79f2fb4bd449" (UID: "c3575355-a4fc-4f78-8512-79f2fb4bd449"). InnerVolumeSpecName "kube-api-access-gvnzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.689246 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvnzq\" (UniqueName: \"kubernetes.io/projected/c3575355-a4fc-4f78-8512-79f2fb4bd449-kube-api-access-gvnzq\") on node \"crc\" DevicePath \"\"" Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.689284 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.698524 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3575355-a4fc-4f78-8512-79f2fb4bd449" (UID: "c3575355-a4fc-4f78-8512-79f2fb4bd449"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:05:30 crc kubenswrapper[4931]: I1201 16:05:30.791624 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3575355-a4fc-4f78-8512-79f2fb4bd449-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 16:05:31 crc kubenswrapper[4931]: I1201 16:05:31.489983 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxd5j" Dec 01 16:05:31 crc kubenswrapper[4931]: I1201 16:05:31.636973 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxd5j"] Dec 01 16:05:31 crc kubenswrapper[4931]: I1201 16:05:31.647179 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wxd5j"] Dec 01 16:05:32 crc kubenswrapper[4931]: I1201 16:05:32.251747 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" path="/var/lib/kubelet/pods/c3575355-a4fc-4f78-8512-79f2fb4bd449/volumes" Dec 01 16:05:38 crc kubenswrapper[4931]: I1201 16:05:38.566691 4931 generic.go:334] "Generic (PLEG): container finished" podID="73286ae8-c068-48b6-ac89-8f5807eb0e54" containerID="5c806437690342fcf8b5d00b937642795f07ce5585284606f50affb8ce833f66" exitCode=0 Dec 01 16:05:38 crc kubenswrapper[4931]: I1201 16:05:38.566850 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p88bd/must-gather-84mmd" event={"ID":"73286ae8-c068-48b6-ac89-8f5807eb0e54","Type":"ContainerDied","Data":"5c806437690342fcf8b5d00b937642795f07ce5585284606f50affb8ce833f66"} Dec 01 16:05:38 crc kubenswrapper[4931]: I1201 16:05:38.567825 4931 scope.go:117] "RemoveContainer" containerID="5c806437690342fcf8b5d00b937642795f07ce5585284606f50affb8ce833f66" Dec 01 16:05:38 crc kubenswrapper[4931]: I1201 16:05:38.880319 4931 scope.go:117] "RemoveContainer" containerID="80346b57b3fb2801c5687bacc00f966c74cecf2ac83c8010e84b5b772a016a62" Dec 01 16:05:38 crc kubenswrapper[4931]: I1201 16:05:38.916231 4931 scope.go:117] "RemoveContainer" containerID="0cc5d3e1b3fe6a36d6cc91136c9199460adbdf0e334616c49be5e40b75487e2d" Dec 01 16:05:38 crc kubenswrapper[4931]: I1201 16:05:38.951168 4931 scope.go:117] "RemoveContainer" containerID="206cb841cfa096041275dfc232e2ab7f03df385fe677bbf0875a53a0ba4208a4" Dec 01 16:05:39 crc kubenswrapper[4931]: 
I1201 16:05:39.467418 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p88bd_must-gather-84mmd_73286ae8-c068-48b6-ac89-8f5807eb0e54/gather/0.log" Dec 01 16:05:43 crc kubenswrapper[4931]: I1201 16:05:43.242326 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:05:43 crc kubenswrapper[4931]: E1201 16:05:43.243482 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:05:47 crc kubenswrapper[4931]: I1201 16:05:47.450600 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p88bd/must-gather-84mmd"] Dec 01 16:05:47 crc kubenswrapper[4931]: I1201 16:05:47.451582 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-p88bd/must-gather-84mmd" podUID="73286ae8-c068-48b6-ac89-8f5807eb0e54" containerName="copy" containerID="cri-o://caffc891a42f82cd2b0b08f92a8ed83901eb1e3be9c8cb528e52019581b4086a" gracePeriod=2 Dec 01 16:05:47 crc kubenswrapper[4931]: I1201 16:05:47.460342 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p88bd/must-gather-84mmd"] Dec 01 16:05:47 crc kubenswrapper[4931]: I1201 16:05:47.711603 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p88bd_must-gather-84mmd_73286ae8-c068-48b6-ac89-8f5807eb0e54/copy/0.log" Dec 01 16:05:47 crc kubenswrapper[4931]: I1201 16:05:47.712543 4931 generic.go:334] "Generic (PLEG): container finished" podID="73286ae8-c068-48b6-ac89-8f5807eb0e54" 
containerID="caffc891a42f82cd2b0b08f92a8ed83901eb1e3be9c8cb528e52019581b4086a" exitCode=143 Dec 01 16:05:47 crc kubenswrapper[4931]: I1201 16:05:47.930316 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p88bd_must-gather-84mmd_73286ae8-c068-48b6-ac89-8f5807eb0e54/copy/0.log" Dec 01 16:05:47 crc kubenswrapper[4931]: I1201 16:05:47.930871 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.010097 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww4x9\" (UniqueName: \"kubernetes.io/projected/73286ae8-c068-48b6-ac89-8f5807eb0e54-kube-api-access-ww4x9\") pod \"73286ae8-c068-48b6-ac89-8f5807eb0e54\" (UID: \"73286ae8-c068-48b6-ac89-8f5807eb0e54\") " Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.037785 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73286ae8-c068-48b6-ac89-8f5807eb0e54-kube-api-access-ww4x9" (OuterVolumeSpecName: "kube-api-access-ww4x9") pod "73286ae8-c068-48b6-ac89-8f5807eb0e54" (UID: "73286ae8-c068-48b6-ac89-8f5807eb0e54"). InnerVolumeSpecName "kube-api-access-ww4x9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.111875 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73286ae8-c068-48b6-ac89-8f5807eb0e54-must-gather-output\") pod \"73286ae8-c068-48b6-ac89-8f5807eb0e54\" (UID: \"73286ae8-c068-48b6-ac89-8f5807eb0e54\") " Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.112420 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww4x9\" (UniqueName: \"kubernetes.io/projected/73286ae8-c068-48b6-ac89-8f5807eb0e54-kube-api-access-ww4x9\") on node \"crc\" DevicePath \"\"" Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.230825 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73286ae8-c068-48b6-ac89-8f5807eb0e54-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "73286ae8-c068-48b6-ac89-8f5807eb0e54" (UID: "73286ae8-c068-48b6-ac89-8f5807eb0e54"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.251260 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73286ae8-c068-48b6-ac89-8f5807eb0e54" path="/var/lib/kubelet/pods/73286ae8-c068-48b6-ac89-8f5807eb0e54/volumes" Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.315291 4931 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/73286ae8-c068-48b6-ac89-8f5807eb0e54-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.728592 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p88bd_must-gather-84mmd_73286ae8-c068-48b6-ac89-8f5807eb0e54/copy/0.log" Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.729382 4931 scope.go:117] "RemoveContainer" containerID="caffc891a42f82cd2b0b08f92a8ed83901eb1e3be9c8cb528e52019581b4086a" Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.729580 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p88bd/must-gather-84mmd" Dec 01 16:05:48 crc kubenswrapper[4931]: I1201 16:05:48.751558 4931 scope.go:117] "RemoveContainer" containerID="5c806437690342fcf8b5d00b937642795f07ce5585284606f50affb8ce833f66" Dec 01 16:05:56 crc kubenswrapper[4931]: I1201 16:05:56.242292 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:05:56 crc kubenswrapper[4931]: E1201 16:05:56.243433 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:06:10 crc kubenswrapper[4931]: I1201 16:06:10.241895 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:06:10 crc kubenswrapper[4931]: E1201 16:06:10.243325 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:06:21 crc kubenswrapper[4931]: I1201 16:06:21.242370 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:06:21 crc kubenswrapper[4931]: E1201 16:06:21.243198 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:06:33 crc kubenswrapper[4931]: I1201 16:06:33.241363 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:06:33 crc kubenswrapper[4931]: E1201 16:06:33.242031 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:06:47 crc kubenswrapper[4931]: I1201 16:06:47.241736 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:06:47 crc kubenswrapper[4931]: E1201 16:06:47.243884 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:06:58 crc kubenswrapper[4931]: I1201 16:06:58.242177 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:06:58 crc kubenswrapper[4931]: E1201 16:06:58.243037 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:07:10 crc kubenswrapper[4931]: I1201 16:07:10.241356 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:07:10 crc kubenswrapper[4931]: E1201 16:07:10.243117 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:07:23 crc kubenswrapper[4931]: I1201 16:07:23.242040 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:07:23 crc kubenswrapper[4931]: E1201 16:07:23.243203 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:07:34 crc kubenswrapper[4931]: I1201 16:07:34.252026 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:07:34 crc kubenswrapper[4931]: E1201 16:07:34.253115 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:07:39 crc kubenswrapper[4931]: I1201 16:07:39.125932 4931 scope.go:117] "RemoveContainer" containerID="55a4d357a24d3405fd1d1e627a9990193564bdc847b32e93d5db6777b95a7e73" Dec 01 16:07:39 crc kubenswrapper[4931]: I1201 16:07:39.169216 4931 scope.go:117] "RemoveContainer" containerID="bbc15a9d22ab98bdf1fea6046732f7eb8f2e3516da125c14f3f93f03a4cdc30e" Dec 01 16:07:49 crc kubenswrapper[4931]: I1201 16:07:49.241824 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:07:49 crc kubenswrapper[4931]: E1201 16:07:49.242540 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:08:00 crc kubenswrapper[4931]: I1201 16:08:00.243019 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:08:00 crc kubenswrapper[4931]: E1201 16:08:00.244188 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" 
podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:08:11 crc kubenswrapper[4931]: I1201 16:08:11.242368 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:08:11 crc kubenswrapper[4931]: E1201 16:08:11.243581 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.531121 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4xsrk/must-gather-j4z5z"] Dec 01 16:08:22 crc kubenswrapper[4931]: E1201 16:08:22.533374 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerName="extract-content" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533410 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerName="extract-content" Dec 01 16:08:22 crc kubenswrapper[4931]: E1201 16:08:22.533436 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerName="extract-utilities" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533443 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerName="extract-utilities" Dec 01 16:08:22 crc kubenswrapper[4931]: E1201 16:08:22.533454 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerName="registry-server" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533460 4931 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerName="registry-server" Dec 01 16:08:22 crc kubenswrapper[4931]: E1201 16:08:22.533488 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerName="extract-utilities" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533494 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerName="extract-utilities" Dec 01 16:08:22 crc kubenswrapper[4931]: E1201 16:08:22.533501 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73286ae8-c068-48b6-ac89-8f5807eb0e54" containerName="copy" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533507 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="73286ae8-c068-48b6-ac89-8f5807eb0e54" containerName="copy" Dec 01 16:08:22 crc kubenswrapper[4931]: E1201 16:08:22.533517 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73286ae8-c068-48b6-ac89-8f5807eb0e54" containerName="gather" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533522 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="73286ae8-c068-48b6-ac89-8f5807eb0e54" containerName="gather" Dec 01 16:08:22 crc kubenswrapper[4931]: E1201 16:08:22.533529 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerName="extract-content" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533535 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerName="extract-content" Dec 01 16:08:22 crc kubenswrapper[4931]: E1201 16:08:22.533546 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerName="registry-server" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533552 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerName="registry-server" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533710 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="44057f8b-7f73-43ad-807b-0dcec437eee0" containerName="registry-server" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533723 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="73286ae8-c068-48b6-ac89-8f5807eb0e54" containerName="gather" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533738 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="73286ae8-c068-48b6-ac89-8f5807eb0e54" containerName="copy" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.533746 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3575355-a4fc-4f78-8512-79f2fb4bd449" containerName="registry-server" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.538508 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.553483 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4xsrk"/"openshift-service-ca.crt" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.553705 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4xsrk"/"kube-root-ca.crt" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.566834 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4xsrk/must-gather-j4z5z"] Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.650472 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6mz8\" (UniqueName: \"kubernetes.io/projected/a59b11df-2c97-4488-93d7-b4ce4e125e80-kube-api-access-d6mz8\") pod \"must-gather-j4z5z\" (UID: \"a59b11df-2c97-4488-93d7-b4ce4e125e80\") " 
pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.650615 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a59b11df-2c97-4488-93d7-b4ce4e125e80-must-gather-output\") pod \"must-gather-j4z5z\" (UID: \"a59b11df-2c97-4488-93d7-b4ce4e125e80\") " pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.752304 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a59b11df-2c97-4488-93d7-b4ce4e125e80-must-gather-output\") pod \"must-gather-j4z5z\" (UID: \"a59b11df-2c97-4488-93d7-b4ce4e125e80\") " pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.752416 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6mz8\" (UniqueName: \"kubernetes.io/projected/a59b11df-2c97-4488-93d7-b4ce4e125e80-kube-api-access-d6mz8\") pod \"must-gather-j4z5z\" (UID: \"a59b11df-2c97-4488-93d7-b4ce4e125e80\") " pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.753222 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a59b11df-2c97-4488-93d7-b4ce4e125e80-must-gather-output\") pod \"must-gather-j4z5z\" (UID: \"a59b11df-2c97-4488-93d7-b4ce4e125e80\") " pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.778274 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6mz8\" (UniqueName: \"kubernetes.io/projected/a59b11df-2c97-4488-93d7-b4ce4e125e80-kube-api-access-d6mz8\") pod \"must-gather-j4z5z\" (UID: \"a59b11df-2c97-4488-93d7-b4ce4e125e80\") " 
pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:08:22 crc kubenswrapper[4931]: I1201 16:08:22.865172 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:08:23 crc kubenswrapper[4931]: I1201 16:08:23.462286 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4xsrk/must-gather-j4z5z"] Dec 01 16:08:24 crc kubenswrapper[4931]: I1201 16:08:24.162205 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" event={"ID":"a59b11df-2c97-4488-93d7-b4ce4e125e80","Type":"ContainerStarted","Data":"832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8"} Dec 01 16:08:24 crc kubenswrapper[4931]: I1201 16:08:24.162752 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" event={"ID":"a59b11df-2c97-4488-93d7-b4ce4e125e80","Type":"ContainerStarted","Data":"6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55"} Dec 01 16:08:24 crc kubenswrapper[4931]: I1201 16:08:24.162766 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" event={"ID":"a59b11df-2c97-4488-93d7-b4ce4e125e80","Type":"ContainerStarted","Data":"c39d28a33292722988c8d3124d8a206b1451cdbe6e1f582ee314085c9d16eca8"} Dec 01 16:08:24 crc kubenswrapper[4931]: I1201 16:08:24.181862 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" podStartSLOduration=2.18184346 podStartE2EDuration="2.18184346s" podCreationTimestamp="2025-12-01 16:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 16:08:24.175201243 +0000 UTC m=+4050.601074910" watchObservedRunningTime="2025-12-01 16:08:24.18184346 +0000 UTC m=+4050.607717127" Dec 01 16:08:25 crc kubenswrapper[4931]: 
I1201 16:08:25.241752 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:08:25 crc kubenswrapper[4931]: E1201 16:08:25.242639 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:08:27 crc kubenswrapper[4931]: I1201 16:08:27.196362 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4xsrk/crc-debug-thdcf"] Dec 01 16:08:27 crc kubenswrapper[4931]: I1201 16:08:27.200277 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:08:27 crc kubenswrapper[4931]: I1201 16:08:27.202728 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4xsrk"/"default-dockercfg-v7fdk" Dec 01 16:08:27 crc kubenswrapper[4931]: I1201 16:08:27.346048 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9153d0e6-dc11-41a6-8c35-11228583bfda-host\") pod \"crc-debug-thdcf\" (UID: \"9153d0e6-dc11-41a6-8c35-11228583bfda\") " pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:08:27 crc kubenswrapper[4931]: I1201 16:08:27.346506 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlzcn\" (UniqueName: \"kubernetes.io/projected/9153d0e6-dc11-41a6-8c35-11228583bfda-kube-api-access-vlzcn\") pod \"crc-debug-thdcf\" (UID: \"9153d0e6-dc11-41a6-8c35-11228583bfda\") " pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:08:27 crc 
kubenswrapper[4931]: I1201 16:08:27.448118 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9153d0e6-dc11-41a6-8c35-11228583bfda-host\") pod \"crc-debug-thdcf\" (UID: \"9153d0e6-dc11-41a6-8c35-11228583bfda\") " pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:08:27 crc kubenswrapper[4931]: I1201 16:08:27.448318 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9153d0e6-dc11-41a6-8c35-11228583bfda-host\") pod \"crc-debug-thdcf\" (UID: \"9153d0e6-dc11-41a6-8c35-11228583bfda\") " pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:08:27 crc kubenswrapper[4931]: I1201 16:08:27.449179 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlzcn\" (UniqueName: \"kubernetes.io/projected/9153d0e6-dc11-41a6-8c35-11228583bfda-kube-api-access-vlzcn\") pod \"crc-debug-thdcf\" (UID: \"9153d0e6-dc11-41a6-8c35-11228583bfda\") " pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:08:27 crc kubenswrapper[4931]: I1201 16:08:27.469990 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlzcn\" (UniqueName: \"kubernetes.io/projected/9153d0e6-dc11-41a6-8c35-11228583bfda-kube-api-access-vlzcn\") pod \"crc-debug-thdcf\" (UID: \"9153d0e6-dc11-41a6-8c35-11228583bfda\") " pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:08:27 crc kubenswrapper[4931]: I1201 16:08:27.544057 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:08:27 crc kubenswrapper[4931]: W1201 16:08:27.580462 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9153d0e6_dc11_41a6_8c35_11228583bfda.slice/crio-d2689c06cf1b1eecd01c9a416357a40266a473377a73cac64d41c0caf25ff659 WatchSource:0}: Error finding container d2689c06cf1b1eecd01c9a416357a40266a473377a73cac64d41c0caf25ff659: Status 404 returned error can't find the container with id d2689c06cf1b1eecd01c9a416357a40266a473377a73cac64d41c0caf25ff659 Dec 01 16:08:28 crc kubenswrapper[4931]: I1201 16:08:28.197609 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/crc-debug-thdcf" event={"ID":"9153d0e6-dc11-41a6-8c35-11228583bfda","Type":"ContainerStarted","Data":"42a0e553c196d833ed427cc59ba4742cfe5f3036cc8befae44a55f20d4464a45"} Dec 01 16:08:28 crc kubenswrapper[4931]: I1201 16:08:28.198235 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/crc-debug-thdcf" event={"ID":"9153d0e6-dc11-41a6-8c35-11228583bfda","Type":"ContainerStarted","Data":"d2689c06cf1b1eecd01c9a416357a40266a473377a73cac64d41c0caf25ff659"} Dec 01 16:08:28 crc kubenswrapper[4931]: I1201 16:08:28.218105 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4xsrk/crc-debug-thdcf" podStartSLOduration=1.21808493 podStartE2EDuration="1.21808493s" podCreationTimestamp="2025-12-01 16:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 16:08:28.214499129 +0000 UTC m=+4054.640372816" watchObservedRunningTime="2025-12-01 16:08:28.21808493 +0000 UTC m=+4054.643958597" Dec 01 16:08:37 crc kubenswrapper[4931]: I1201 16:08:37.241700 4931 scope.go:117] "RemoveContainer" 
containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:08:37 crc kubenswrapper[4931]: E1201 16:08:37.242595 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:08:39 crc kubenswrapper[4931]: I1201 16:08:39.266120 4931 scope.go:117] "RemoveContainer" containerID="7ef2fef68a10addc021b804bb152778a996dce28f34da94aeeab2eea48f6ad49" Dec 01 16:08:48 crc kubenswrapper[4931]: I1201 16:08:48.241453 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:08:48 crc kubenswrapper[4931]: E1201 16:08:48.242281 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:09:01 crc kubenswrapper[4931]: I1201 16:09:01.480948 4931 generic.go:334] "Generic (PLEG): container finished" podID="9153d0e6-dc11-41a6-8c35-11228583bfda" containerID="42a0e553c196d833ed427cc59ba4742cfe5f3036cc8befae44a55f20d4464a45" exitCode=0 Dec 01 16:09:01 crc kubenswrapper[4931]: I1201 16:09:01.481106 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/crc-debug-thdcf" event={"ID":"9153d0e6-dc11-41a6-8c35-11228583bfda","Type":"ContainerDied","Data":"42a0e553c196d833ed427cc59ba4742cfe5f3036cc8befae44a55f20d4464a45"} Dec 
01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.248620 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:09:02 crc kubenswrapper[4931]: E1201 16:09:02.249482 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.578183 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.610079 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4xsrk/crc-debug-thdcf"] Dec 01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.617551 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4xsrk/crc-debug-thdcf"] Dec 01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.727444 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlzcn\" (UniqueName: \"kubernetes.io/projected/9153d0e6-dc11-41a6-8c35-11228583bfda-kube-api-access-vlzcn\") pod \"9153d0e6-dc11-41a6-8c35-11228583bfda\" (UID: \"9153d0e6-dc11-41a6-8c35-11228583bfda\") " Dec 01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.727523 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9153d0e6-dc11-41a6-8c35-11228583bfda-host\") pod \"9153d0e6-dc11-41a6-8c35-11228583bfda\" (UID: \"9153d0e6-dc11-41a6-8c35-11228583bfda\") " Dec 01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.727915 4931 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9153d0e6-dc11-41a6-8c35-11228583bfda-host" (OuterVolumeSpecName: "host") pod "9153d0e6-dc11-41a6-8c35-11228583bfda" (UID: "9153d0e6-dc11-41a6-8c35-11228583bfda"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.740593 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9153d0e6-dc11-41a6-8c35-11228583bfda-kube-api-access-vlzcn" (OuterVolumeSpecName: "kube-api-access-vlzcn") pod "9153d0e6-dc11-41a6-8c35-11228583bfda" (UID: "9153d0e6-dc11-41a6-8c35-11228583bfda"). InnerVolumeSpecName "kube-api-access-vlzcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.829761 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlzcn\" (UniqueName: \"kubernetes.io/projected/9153d0e6-dc11-41a6-8c35-11228583bfda-kube-api-access-vlzcn\") on node \"crc\" DevicePath \"\"" Dec 01 16:09:02 crc kubenswrapper[4931]: I1201 16:09:02.829797 4931 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9153d0e6-dc11-41a6-8c35-11228583bfda-host\") on node \"crc\" DevicePath \"\"" Dec 01 16:09:03 crc kubenswrapper[4931]: I1201 16:09:03.500138 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2689c06cf1b1eecd01c9a416357a40266a473377a73cac64d41c0caf25ff659" Dec 01 16:09:03 crc kubenswrapper[4931]: I1201 16:09:03.500300 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-thdcf" Dec 01 16:09:03 crc kubenswrapper[4931]: I1201 16:09:03.793607 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4xsrk/crc-debug-lnfxn"] Dec 01 16:09:03 crc kubenswrapper[4931]: E1201 16:09:03.794151 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9153d0e6-dc11-41a6-8c35-11228583bfda" containerName="container-00" Dec 01 16:09:03 crc kubenswrapper[4931]: I1201 16:09:03.794172 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="9153d0e6-dc11-41a6-8c35-11228583bfda" containerName="container-00" Dec 01 16:09:03 crc kubenswrapper[4931]: I1201 16:09:03.794472 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="9153d0e6-dc11-41a6-8c35-11228583bfda" containerName="container-00" Dec 01 16:09:03 crc kubenswrapper[4931]: I1201 16:09:03.795263 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:03 crc kubenswrapper[4931]: I1201 16:09:03.797845 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4xsrk"/"default-dockercfg-v7fdk" Dec 01 16:09:03 crc kubenswrapper[4931]: I1201 16:09:03.951317 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-host\") pod \"crc-debug-lnfxn\" (UID: \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\") " pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:03 crc kubenswrapper[4931]: I1201 16:09:03.951715 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq7bt\" (UniqueName: \"kubernetes.io/projected/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-kube-api-access-hq7bt\") pod \"crc-debug-lnfxn\" (UID: \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\") " 
pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:04 crc kubenswrapper[4931]: I1201 16:09:04.053942 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-host\") pod \"crc-debug-lnfxn\" (UID: \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\") " pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:04 crc kubenswrapper[4931]: I1201 16:09:04.054046 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq7bt\" (UniqueName: \"kubernetes.io/projected/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-kube-api-access-hq7bt\") pod \"crc-debug-lnfxn\" (UID: \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\") " pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:04 crc kubenswrapper[4931]: I1201 16:09:04.054182 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-host\") pod \"crc-debug-lnfxn\" (UID: \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\") " pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:04 crc kubenswrapper[4931]: I1201 16:09:04.263224 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9153d0e6-dc11-41a6-8c35-11228583bfda" path="/var/lib/kubelet/pods/9153d0e6-dc11-41a6-8c35-11228583bfda/volumes" Dec 01 16:09:04 crc kubenswrapper[4931]: I1201 16:09:04.316555 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq7bt\" (UniqueName: \"kubernetes.io/projected/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-kube-api-access-hq7bt\") pod \"crc-debug-lnfxn\" (UID: \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\") " pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:04 crc kubenswrapper[4931]: I1201 16:09:04.427362 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:04 crc kubenswrapper[4931]: I1201 16:09:04.508630 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" event={"ID":"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3","Type":"ContainerStarted","Data":"28999963f69acd2668971f8a9e83aa87e30599487effbd774e43a27fcc4ed921"} Dec 01 16:09:04 crc kubenswrapper[4931]: E1201 16:09:04.916551 4931 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57f52646_fe0d_4a22_b3a3_70cf71bc7bf3.slice/crio-2bf9e2b4823c757d93245383b0bcf1e1d91b64a99e7331f2345b43beb01f12ee.scope\": RecentStats: unable to find data in memory cache]" Dec 01 16:09:05 crc kubenswrapper[4931]: I1201 16:09:05.521157 4931 generic.go:334] "Generic (PLEG): container finished" podID="57f52646-fe0d-4a22-b3a3-70cf71bc7bf3" containerID="2bf9e2b4823c757d93245383b0bcf1e1d91b64a99e7331f2345b43beb01f12ee" exitCode=0 Dec 01 16:09:05 crc kubenswrapper[4931]: I1201 16:09:05.521276 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" event={"ID":"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3","Type":"ContainerDied","Data":"2bf9e2b4823c757d93245383b0bcf1e1d91b64a99e7331f2345b43beb01f12ee"} Dec 01 16:09:05 crc kubenswrapper[4931]: I1201 16:09:05.970995 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4xsrk/crc-debug-lnfxn"] Dec 01 16:09:05 crc kubenswrapper[4931]: I1201 16:09:05.978095 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4xsrk/crc-debug-lnfxn"] Dec 01 16:09:06 crc kubenswrapper[4931]: I1201 16:09:06.622875 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:06 crc kubenswrapper[4931]: I1201 16:09:06.805429 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq7bt\" (UniqueName: \"kubernetes.io/projected/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-kube-api-access-hq7bt\") pod \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\" (UID: \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\") " Dec 01 16:09:06 crc kubenswrapper[4931]: I1201 16:09:06.805574 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-host\") pod \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\" (UID: \"57f52646-fe0d-4a22-b3a3-70cf71bc7bf3\") " Dec 01 16:09:06 crc kubenswrapper[4931]: I1201 16:09:06.806233 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-host" (OuterVolumeSpecName: "host") pod "57f52646-fe0d-4a22-b3a3-70cf71bc7bf3" (UID: "57f52646-fe0d-4a22-b3a3-70cf71bc7bf3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 16:09:06 crc kubenswrapper[4931]: I1201 16:09:06.811626 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-kube-api-access-hq7bt" (OuterVolumeSpecName: "kube-api-access-hq7bt") pod "57f52646-fe0d-4a22-b3a3-70cf71bc7bf3" (UID: "57f52646-fe0d-4a22-b3a3-70cf71bc7bf3"). InnerVolumeSpecName "kube-api-access-hq7bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:09:06 crc kubenswrapper[4931]: I1201 16:09:06.907946 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq7bt\" (UniqueName: \"kubernetes.io/projected/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-kube-api-access-hq7bt\") on node \"crc\" DevicePath \"\"" Dec 01 16:09:06 crc kubenswrapper[4931]: I1201 16:09:06.907996 4931 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3-host\") on node \"crc\" DevicePath \"\"" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.298371 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4xsrk/crc-debug-8897m"] Dec 01 16:09:07 crc kubenswrapper[4931]: E1201 16:09:07.298823 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f52646-fe0d-4a22-b3a3-70cf71bc7bf3" containerName="container-00" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.298839 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f52646-fe0d-4a22-b3a3-70cf71bc7bf3" containerName="container-00" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.299120 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f52646-fe0d-4a22-b3a3-70cf71bc7bf3" containerName="container-00" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.299968 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.418077 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvfj5\" (UniqueName: \"kubernetes.io/projected/308ef49d-0979-4c0f-9b9f-7beca102d5b6-kube-api-access-cvfj5\") pod \"crc-debug-8897m\" (UID: \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\") " pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.418514 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/308ef49d-0979-4c0f-9b9f-7beca102d5b6-host\") pod \"crc-debug-8897m\" (UID: \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\") " pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.521131 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvfj5\" (UniqueName: \"kubernetes.io/projected/308ef49d-0979-4c0f-9b9f-7beca102d5b6-kube-api-access-cvfj5\") pod \"crc-debug-8897m\" (UID: \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\") " pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.521261 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/308ef49d-0979-4c0f-9b9f-7beca102d5b6-host\") pod \"crc-debug-8897m\" (UID: \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\") " pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.521453 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/308ef49d-0979-4c0f-9b9f-7beca102d5b6-host\") pod \"crc-debug-8897m\" (UID: \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\") " pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:07 crc 
kubenswrapper[4931]: I1201 16:09:07.538893 4931 scope.go:117] "RemoveContainer" containerID="2bf9e2b4823c757d93245383b0bcf1e1d91b64a99e7331f2345b43beb01f12ee" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.539299 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-lnfxn" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.548399 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvfj5\" (UniqueName: \"kubernetes.io/projected/308ef49d-0979-4c0f-9b9f-7beca102d5b6-kube-api-access-cvfj5\") pod \"crc-debug-8897m\" (UID: \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\") " pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:07 crc kubenswrapper[4931]: I1201 16:09:07.623795 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:07 crc kubenswrapper[4931]: W1201 16:09:07.645994 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod308ef49d_0979_4c0f_9b9f_7beca102d5b6.slice/crio-1f78b905bd11b37b0851f908ecedd857ed12c090d32158901bfba52cd930f468 WatchSource:0}: Error finding container 1f78b905bd11b37b0851f908ecedd857ed12c090d32158901bfba52cd930f468: Status 404 returned error can't find the container with id 1f78b905bd11b37b0851f908ecedd857ed12c090d32158901bfba52cd930f468 Dec 01 16:09:08 crc kubenswrapper[4931]: I1201 16:09:08.253550 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f52646-fe0d-4a22-b3a3-70cf71bc7bf3" path="/var/lib/kubelet/pods/57f52646-fe0d-4a22-b3a3-70cf71bc7bf3/volumes" Dec 01 16:09:08 crc kubenswrapper[4931]: I1201 16:09:08.548814 4931 generic.go:334] "Generic (PLEG): container finished" podID="308ef49d-0979-4c0f-9b9f-7beca102d5b6" containerID="95615c6752ee2c1fd32dd071607922c499f156d10addbdb6b5dfbe3c48f0a4d4" exitCode=0 Dec 01 16:09:08 crc 
kubenswrapper[4931]: I1201 16:09:08.548875 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/crc-debug-8897m" event={"ID":"308ef49d-0979-4c0f-9b9f-7beca102d5b6","Type":"ContainerDied","Data":"95615c6752ee2c1fd32dd071607922c499f156d10addbdb6b5dfbe3c48f0a4d4"} Dec 01 16:09:08 crc kubenswrapper[4931]: I1201 16:09:08.548922 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/crc-debug-8897m" event={"ID":"308ef49d-0979-4c0f-9b9f-7beca102d5b6","Type":"ContainerStarted","Data":"1f78b905bd11b37b0851f908ecedd857ed12c090d32158901bfba52cd930f468"} Dec 01 16:09:08 crc kubenswrapper[4931]: I1201 16:09:08.590301 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4xsrk/crc-debug-8897m"] Dec 01 16:09:08 crc kubenswrapper[4931]: I1201 16:09:08.597530 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4xsrk/crc-debug-8897m"] Dec 01 16:09:09 crc kubenswrapper[4931]: I1201 16:09:09.665984 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:09 crc kubenswrapper[4931]: I1201 16:09:09.760299 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvfj5\" (UniqueName: \"kubernetes.io/projected/308ef49d-0979-4c0f-9b9f-7beca102d5b6-kube-api-access-cvfj5\") pod \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\" (UID: \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\") " Dec 01 16:09:09 crc kubenswrapper[4931]: I1201 16:09:09.760415 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/308ef49d-0979-4c0f-9b9f-7beca102d5b6-host\") pod \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\" (UID: \"308ef49d-0979-4c0f-9b9f-7beca102d5b6\") " Dec 01 16:09:09 crc kubenswrapper[4931]: I1201 16:09:09.760670 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/308ef49d-0979-4c0f-9b9f-7beca102d5b6-host" (OuterVolumeSpecName: "host") pod "308ef49d-0979-4c0f-9b9f-7beca102d5b6" (UID: "308ef49d-0979-4c0f-9b9f-7beca102d5b6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 16:09:09 crc kubenswrapper[4931]: I1201 16:09:09.761055 4931 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/308ef49d-0979-4c0f-9b9f-7beca102d5b6-host\") on node \"crc\" DevicePath \"\"" Dec 01 16:09:09 crc kubenswrapper[4931]: I1201 16:09:09.766528 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308ef49d-0979-4c0f-9b9f-7beca102d5b6-kube-api-access-cvfj5" (OuterVolumeSpecName: "kube-api-access-cvfj5") pod "308ef49d-0979-4c0f-9b9f-7beca102d5b6" (UID: "308ef49d-0979-4c0f-9b9f-7beca102d5b6"). InnerVolumeSpecName "kube-api-access-cvfj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:09:09 crc kubenswrapper[4931]: I1201 16:09:09.862272 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvfj5\" (UniqueName: \"kubernetes.io/projected/308ef49d-0979-4c0f-9b9f-7beca102d5b6-kube-api-access-cvfj5\") on node \"crc\" DevicePath \"\"" Dec 01 16:09:10 crc kubenswrapper[4931]: I1201 16:09:10.251005 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308ef49d-0979-4c0f-9b9f-7beca102d5b6" path="/var/lib/kubelet/pods/308ef49d-0979-4c0f-9b9f-7beca102d5b6/volumes" Dec 01 16:09:10 crc kubenswrapper[4931]: I1201 16:09:10.565930 4931 scope.go:117] "RemoveContainer" containerID="95615c6752ee2c1fd32dd071607922c499f156d10addbdb6b5dfbe3c48f0a4d4" Dec 01 16:09:10 crc kubenswrapper[4931]: I1201 16:09:10.566025 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsrk/crc-debug-8897m" Dec 01 16:09:17 crc kubenswrapper[4931]: I1201 16:09:17.243010 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:09:17 crc kubenswrapper[4931]: E1201 16:09:17.243832 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:09:31 crc kubenswrapper[4931]: I1201 16:09:31.242224 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:09:31 crc kubenswrapper[4931]: E1201 16:09:31.244592 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:09:35 crc kubenswrapper[4931]: I1201 16:09:35.973557 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-779b959886-w78q9_93a16952-5c17-428c-b560-4a661b8b5416/barbican-api/0.log" Dec 01 16:09:36 crc kubenswrapper[4931]: I1201 16:09:36.186489 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-779b959886-w78q9_93a16952-5c17-428c-b560-4a661b8b5416/barbican-api-log/0.log" Dec 01 16:09:36 crc kubenswrapper[4931]: I1201 16:09:36.189087 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6765f56b8d-pn88t_05a83efe-a056-4531-b9f3-c6c4f87a9cdb/barbican-keystone-listener/0.log" Dec 01 16:09:36 crc kubenswrapper[4931]: I1201 16:09:36.278640 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6765f56b8d-pn88t_05a83efe-a056-4531-b9f3-c6c4f87a9cdb/barbican-keystone-listener-log/0.log" Dec 01 16:09:36 crc kubenswrapper[4931]: I1201 16:09:36.384859 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d6fd8967-rrlqd_4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5/barbican-worker/0.log" Dec 01 16:09:36 crc kubenswrapper[4931]: I1201 16:09:36.439771 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76d6fd8967-rrlqd_4f87ed3f-a9e4-4716-880c-f4d0abe0eaa5/barbican-worker-log/0.log" Dec 01 16:09:37 crc kubenswrapper[4931]: I1201 16:09:37.453680 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_de8e3974-4d05-40e2-9306-01dd34663e53/ceilometer-notification-agent/0.log" Dec 01 16:09:37 crc kubenswrapper[4931]: 
I1201 16:09:37.482363 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_de8e3974-4d05-40e2-9306-01dd34663e53/ceilometer-central-agent/0.log" Dec 01 16:09:37 crc kubenswrapper[4931]: I1201 16:09:37.483924 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4vjgn_b5d464d5-ddf8-4b7f-b1fb-7d65c5edd6f4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:37 crc kubenswrapper[4931]: I1201 16:09:37.778018 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_de8e3974-4d05-40e2-9306-01dd34663e53/proxy-httpd/0.log" Dec 01 16:09:37 crc kubenswrapper[4931]: I1201 16:09:37.843847 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_de8e3974-4d05-40e2-9306-01dd34663e53/sg-core/0.log" Dec 01 16:09:37 crc kubenswrapper[4931]: I1201 16:09:37.907314 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5d91c4a2-739a-4533-a21b-9aa362069d32/cinder-api/0.log" Dec 01 16:09:37 crc kubenswrapper[4931]: I1201 16:09:37.953765 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5d91c4a2-739a-4533-a21b-9aa362069d32/cinder-api-log/0.log" Dec 01 16:09:38 crc kubenswrapper[4931]: I1201 16:09:38.047560 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2cae1c45-3e2d-4df6-93a6-b133953bdce0/cinder-scheduler/0.log" Dec 01 16:09:38 crc kubenswrapper[4931]: I1201 16:09:38.142578 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2cae1c45-3e2d-4df6-93a6-b133953bdce0/probe/0.log" Dec 01 16:09:38 crc kubenswrapper[4931]: I1201 16:09:38.284969 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-65xbv_ff92bfe2-2afc-4cc2-9317-db96b912117c/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" 
Dec 01 16:09:38 crc kubenswrapper[4931]: I1201 16:09:38.373250 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hc2c4_ae337b8b-ad01-493f-9471-aec15d221507/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:38 crc kubenswrapper[4931]: I1201 16:09:38.451661 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-w6hcc_e71b5ae6-3b87-43c4-839b-350df6114a20/init/0.log" Dec 01 16:09:38 crc kubenswrapper[4931]: I1201 16:09:38.713689 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-w6hcc_e71b5ae6-3b87-43c4-839b-350df6114a20/init/0.log" Dec 01 16:09:38 crc kubenswrapper[4931]: I1201 16:09:38.789088 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bwr7d_3c64abd5-bd25-443f-b5f6-ee62c4ad5c0d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:38 crc kubenswrapper[4931]: I1201 16:09:38.795052 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-w6hcc_e71b5ae6-3b87-43c4-839b-350df6114a20/dnsmasq-dns/0.log" Dec 01 16:09:38 crc kubenswrapper[4931]: I1201 16:09:38.968974 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2316f8ea-3789-4702-91c2-a44da618bb8d/glance-log/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.042949 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2316f8ea-3789-4702-91c2-a44da618bb8d/glance-httpd/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.063585 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ab67b9e9-4315-4390-b414-89b215ad823b/glance-httpd/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.181251 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_ab67b9e9-4315-4390-b414-89b215ad823b/glance-log/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.342256 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-65c944c654-l6mmj_1a2f9f3b-603b-4004-8e6f-dce5b810785c/horizon/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.436867 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-z54c5_624618ca-ac0b-4fcf-bcf8-a7e744c98241/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.565770 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-476jq_c09d1f20-7083-4a46-bb55-734481a5d66c/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.577882 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-65c944c654-l6mmj_1a2f9f3b-603b-4004-8e6f-dce5b810785c/horizon-log/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.754970 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7964d85c7c-w2fwr_407f9337-3fab-42cc-b10d-eada296e7919/keystone-api/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.801190 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29410081-j4xz2_736a9048-9855-44b1-aae0-9da840848c45/keystone-cron/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.829749 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8aa0aa6b-68f8-4c4f-b192-a967b9ab5cc7/kube-state-metrics/0.log" Dec 01 16:09:39 crc kubenswrapper[4931]: I1201 16:09:39.980729 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vrghr_c016013e-7315-4763-9b4d-0876e4c2068f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:40 crc kubenswrapper[4931]: I1201 16:09:40.296807 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f86b4c4b5-zs55r_38afe8fe-5f17-4be2-84b6-da4211785ed1/neutron-httpd/0.log" Dec 01 16:09:40 crc kubenswrapper[4931]: I1201 16:09:40.333953 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fqgdv_6225489b-6b7e-40c4-9f3e-e2e28b74d274/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:40 crc kubenswrapper[4931]: I1201 16:09:40.345644 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f86b4c4b5-zs55r_38afe8fe-5f17-4be2-84b6-da4211785ed1/neutron-api/0.log" Dec 01 16:09:40 crc kubenswrapper[4931]: I1201 16:09:40.925468 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2410dbb5-8347-4484-b4dc-6ed9dd32edf7/nova-api-log/0.log" Dec 01 16:09:40 crc kubenswrapper[4931]: I1201 16:09:40.981084 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c084c37d-132d-466d-94d5-8176928d467e/nova-cell0-conductor-conductor/0.log" Dec 01 16:09:41 crc kubenswrapper[4931]: I1201 16:09:41.193078 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2410dbb5-8347-4484-b4dc-6ed9dd32edf7/nova-api-api/0.log" Dec 01 16:09:41 crc kubenswrapper[4931]: I1201 16:09:41.231353 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_dfb9b8d1-45eb-45c7-af4e-64c6c020860d/nova-cell1-conductor-conductor/0.log" Dec 01 16:09:41 crc kubenswrapper[4931]: I1201 16:09:41.514413 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qbgns_90baac5e-c041-4bd1-bba8-b11e708370e7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:41 crc kubenswrapper[4931]: I1201 16:09:41.529871 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a5e7833d-55a2-4896-9ca7-610c68157f00/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 16:09:41 crc kubenswrapper[4931]: I1201 16:09:41.707143 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db/nova-metadata-log/0.log" Dec 01 16:09:41 crc kubenswrapper[4931]: I1201 16:09:41.947763 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cf2acf20-9353-4a11-a09c-3d455a247303/mysql-bootstrap/0.log" Dec 01 16:09:42 crc kubenswrapper[4931]: I1201 16:09:42.070260 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_951e2313-b009-42b4-8d52-20a4e3ad6dbf/nova-scheduler-scheduler/0.log" Dec 01 16:09:42 crc kubenswrapper[4931]: I1201 16:09:42.192690 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cf2acf20-9353-4a11-a09c-3d455a247303/mysql-bootstrap/0.log" Dec 01 16:09:42 crc kubenswrapper[4931]: I1201 16:09:42.208988 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cf2acf20-9353-4a11-a09c-3d455a247303/galera/0.log" Dec 01 16:09:42 crc kubenswrapper[4931]: I1201 16:09:42.403257 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa/mysql-bootstrap/0.log" Dec 01 16:09:42 crc kubenswrapper[4931]: I1201 16:09:42.609762 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa/mysql-bootstrap/0.log" Dec 01 16:09:42 crc kubenswrapper[4931]: I1201 16:09:42.619170 4931 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6577a22e-d5d6-4eb2-ad93-61b37d8ac0fa/galera/0.log" Dec 01 16:09:42 crc kubenswrapper[4931]: I1201 16:09:42.819930 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0ca8ee92-e191-4e9b-aa91-af27342a9fb5/openstackclient/0.log" Dec 01 16:09:42 crc kubenswrapper[4931]: I1201 16:09:42.887558 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xhksj_0476eb16-ff0d-476b-b6da-dd437e123f26/openstack-network-exporter/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.075321 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cgg9p_a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24/ovsdb-server-init/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.124252 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5493bd5f-c1b6-469e-b0e3-6fe77c4ec2db/nova-metadata-metadata/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.241369 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:09:43 crc kubenswrapper[4931]: E1201 16:09:43.241615 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.243668 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cgg9p_a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24/ovs-vswitchd/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.252567 4931 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cgg9p_a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24/ovsdb-server-init/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.294731 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cgg9p_a35e17f6-4c8d-4b5d-aea6-5e0dc2000a24/ovsdb-server/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.452377 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-v8h85_6f943374-baa7-4200-93ff-6773c58b032d/ovn-controller/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.497858 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-btzw5_249f41eb-85d7-46f2-80d3-5f1ea0dcbda7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.638988 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47/openstack-network-exporter/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.713187 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ed8590cd-4bfa-42cc-b5c3-8fb6f538aa47/ovn-northd/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.825332 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d08b55bc-d817-44c5-81f3-6581236b50c1/openstack-network-exporter/0.log" Dec 01 16:09:43 crc kubenswrapper[4931]: I1201 16:09:43.888231 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d08b55bc-d817-44c5-81f3-6581236b50c1/ovsdbserver-nb/0.log" Dec 01 16:09:44 crc kubenswrapper[4931]: I1201 16:09:44.035545 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f5fc34b-28c1-4e76-8de7-aa52db803802/openstack-network-exporter/0.log" Dec 01 16:09:44 crc kubenswrapper[4931]: I1201 
16:09:44.041352 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f5fc34b-28c1-4e76-8de7-aa52db803802/ovsdbserver-sb/0.log" Dec 01 16:09:44 crc kubenswrapper[4931]: I1201 16:09:44.990857 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b364f6e4-552e-435c-b684-d6ebbc851ef2/setup-container/0.log" Dec 01 16:09:45 crc kubenswrapper[4931]: I1201 16:09:45.089529 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c9d84d99b-fj4vs_ae11272a-fb06-4ea3-8ab4-64a667d9cdd9/placement-log/0.log" Dec 01 16:09:45 crc kubenswrapper[4931]: I1201 16:09:45.112399 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6c9d84d99b-fj4vs_ae11272a-fb06-4ea3-8ab4-64a667d9cdd9/placement-api/0.log" Dec 01 16:09:45 crc kubenswrapper[4931]: I1201 16:09:45.197300 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b364f6e4-552e-435c-b684-d6ebbc851ef2/setup-container/0.log" Dec 01 16:09:45 crc kubenswrapper[4931]: I1201 16:09:45.310867 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bc2ff309-c81a-4d19-bfb0-99a4a975b70a/setup-container/0.log" Dec 01 16:09:45 crc kubenswrapper[4931]: I1201 16:09:45.324136 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b364f6e4-552e-435c-b684-d6ebbc851ef2/rabbitmq/0.log" Dec 01 16:09:45 crc kubenswrapper[4931]: I1201 16:09:45.570323 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bc2ff309-c81a-4d19-bfb0-99a4a975b70a/rabbitmq/0.log" Dec 01 16:09:45 crc kubenswrapper[4931]: I1201 16:09:45.591494 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bc2ff309-c81a-4d19-bfb0-99a4a975b70a/setup-container/0.log" Dec 01 16:09:45 crc kubenswrapper[4931]: I1201 16:09:45.634508 4931 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qp8v4_616e2b9d-1da1-44fd-98c3-7f8cbc8d686c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:46 crc kubenswrapper[4931]: I1201 16:09:46.312083 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zxmwj_c44015a7-22fa-4746-8536-2f7c70888a5d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:46 crc kubenswrapper[4931]: I1201 16:09:46.327808 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-wx9bv_000355db-6a0a-46ab-8e8e-1040c775de8a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:46 crc kubenswrapper[4931]: I1201 16:09:46.532843 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-kqxsm_def3b4c2-cbcc-4aae-a750-6c96cd0d8f67/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:46 crc kubenswrapper[4931]: I1201 16:09:46.623348 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-f4bm9_04b2b3de-fd72-45fc-9d34-2d5dfb1ce4fc/ssh-known-hosts-edpm-deployment/0.log" Dec 01 16:09:46 crc kubenswrapper[4931]: I1201 16:09:46.851297 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7465544595-sc668_2b1d2c6e-39e6-438c-98e8-be76bfa71050/proxy-server/0.log" Dec 01 16:09:46 crc kubenswrapper[4931]: I1201 16:09:46.950865 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7465544595-sc668_2b1d2c6e-39e6-438c-98e8-be76bfa71050/proxy-httpd/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.001066 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-lbvg4_ec217c21-8698-4dd3-a58d-b2626cbfbaf2/swift-ring-rebalance/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 
16:09:47.048902 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/account-auditor/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.155313 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/account-reaper/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.212270 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/account-replicator/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.296958 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/account-server/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.330710 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/container-auditor/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.458373 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/container-server/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.465270 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/container-replicator/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.538752 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/container-updater/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.539520 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-auditor/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.654993 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-expirer/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.699259 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-replicator/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.721250 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-server/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.723618 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/object-updater/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.836325 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/rsync/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.891235 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_fe036b57-6753-42af-ad39-195f0688532d/swift-recon-cron/0.log" Dec 01 16:09:47 crc kubenswrapper[4931]: I1201 16:09:47.985904 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2xthl_d89048c3-b9a3-4274-8d12-9543d8a29503/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:48 crc kubenswrapper[4931]: I1201 16:09:48.088966 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e384c534-76cd-4296-9318-aaf007e87661/tempest-tests-tempest-tests-runner/0.log" Dec 01 16:09:48 crc kubenswrapper[4931]: I1201 16:09:48.266460 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_b0a9c963-758f-4d4d-a9c0-e148a5733bf9/test-operator-logs-container/0.log" Dec 01 16:09:48 crc kubenswrapper[4931]: I1201 
16:09:48.405363 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-8qzr8_e4bfaa73-8d90-4e34-99a7-2e0f70ddadc5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 16:09:55 crc kubenswrapper[4931]: I1201 16:09:55.241572 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:09:55 crc kubenswrapper[4931]: I1201 16:09:55.993580 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"9a361845d1da44cb7b29595626124fca03becd0f451d39200eda9b78eee72f1b"} Dec 01 16:09:57 crc kubenswrapper[4931]: I1201 16:09:57.701089 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_474458a3-5f29-4735-bed5-96f2f1d6e352/memcached/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.269044 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/util/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.478933 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/util/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.488073 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/pull/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.534707 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/pull/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.696688 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/util/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.696918 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/extract/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.713414 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9d8a48dc3f9662ac3564cf1fbbc1c83727c494fbc7abe3aa11036ffb68xf9n2_f08eaf36-78a8-4183-b663-22eaefe5cb6b/pull/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.874742 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-wcntp_71d6d824-1cf3-4984-b8de-f10d19192a5f/kube-rbac-proxy/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.923433 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-lr4mz_fff7f112-af9f-42e3-beef-e0efdcf602c9/kube-rbac-proxy/0.log" Dec 01 16:10:15 crc kubenswrapper[4931]: I1201 16:10:15.925158 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-wcntp_71d6d824-1cf3-4984-b8de-f10d19192a5f/manager/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.109560 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-g2jqf_8258a972-ead1-4bee-ae4f-cba90b238dde/kube-rbac-proxy/0.log" Dec 01 16:10:16 crc 
kubenswrapper[4931]: I1201 16:10:16.135326 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-g2jqf_8258a972-ead1-4bee-ae4f-cba90b238dde/manager/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.151043 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-lr4mz_fff7f112-af9f-42e3-beef-e0efdcf602c9/manager/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.297467 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-g8vfm_006dbf05-46c7-4348-a9bb-74a7c56fd3fd/kube-rbac-proxy/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.392588 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-g8vfm_006dbf05-46c7-4348-a9bb-74a7c56fd3fd/manager/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.468436 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-2fp96_921c2e7b-0f37-4e93-ab2e-76a23e146d28/kube-rbac-proxy/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.486135 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-2fp96_921c2e7b-0f37-4e93-ab2e-76a23e146d28/manager/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.597990 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6x5bb_6b36c7c7-1886-476e-b0d8-50168c04ff83/kube-rbac-proxy/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.687970 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6x5bb_6b36c7c7-1886-476e-b0d8-50168c04ff83/manager/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.772598 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-sk9bq_371eef0f-aae7-40bd-9c47-1ffd0e77e08d/kube-rbac-proxy/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.910221 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-gzpgd_e20c64cb-88d5-4ffe-bb88-8715010ccf33/kube-rbac-proxy/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.941903 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-sk9bq_371eef0f-aae7-40bd-9c47-1ffd0e77e08d/manager/0.log" Dec 01 16:10:16 crc kubenswrapper[4931]: I1201 16:10:16.977341 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-gzpgd_e20c64cb-88d5-4ffe-bb88-8715010ccf33/manager/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.086121 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-l24dv_56ca76b6-8e16-4d73-9e78-f20e046738fc/kube-rbac-proxy/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.152373 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-l24dv_56ca76b6-8e16-4d73-9e78-f20e046738fc/manager/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.284722 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-bpqmb_1c0a3def-dea0-40f3-8368-a36f6030f7f7/manager/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.286335 4931 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-bpqmb_1c0a3def-dea0-40f3-8368-a36f6030f7f7/kube-rbac-proxy/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.395377 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ms6t7_b30439fb-2d71-4c3b-97ec-5e304c1eb15e/kube-rbac-proxy/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.545854 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-jvkpz_e32d4db7-aa36-4724-a113-2f7ff2af254d/kube-rbac-proxy/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.558314 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ms6t7_b30439fb-2d71-4c3b-97ec-5e304c1eb15e/manager/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.660286 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-jvkpz_e32d4db7-aa36-4724-a113-2f7ff2af254d/manager/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.742398 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-h5c6g_3dc1d698-63f8-4e4b-8e28-6e128b5b46da/kube-rbac-proxy/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.776553 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-h5c6g_3dc1d698-63f8-4e4b-8e28-6e128b5b46da/manager/0.log" Dec 01 16:10:17 crc kubenswrapper[4931]: I1201 16:10:17.892732 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mtpcz_e45f587b-3120-4b16-9c9a-66bc5c1252aa/kube-rbac-proxy/0.log" Dec 01 16:10:17 crc 
kubenswrapper[4931]: I1201 16:10:17.914911 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mtpcz_e45f587b-3120-4b16-9c9a-66bc5c1252aa/manager/0.log" Dec 01 16:10:18 crc kubenswrapper[4931]: I1201 16:10:18.069325 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc_b366d143-b330-4141-be76-b87796b94301/manager/0.log" Dec 01 16:10:18 crc kubenswrapper[4931]: I1201 16:10:18.073858 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4t5cwc_b366d143-b330-4141-be76-b87796b94301/kube-rbac-proxy/0.log" Dec 01 16:10:18 crc kubenswrapper[4931]: I1201 16:10:18.337343 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-72slc_6fff8f62-0a00-45e5-9e66-bd92dff14023/registry-server/0.log" Dec 01 16:10:18 crc kubenswrapper[4931]: I1201 16:10:18.462842 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-g9s8h_31166fda-e2fe-4a4a-9717-550172ed4093/kube-rbac-proxy/0.log" Dec 01 16:10:18 crc kubenswrapper[4931]: I1201 16:10:18.463206 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-654ffbd64b-qsbsr_a34fc488-c895-4f50-9164-ced702fcf61d/operator/0.log" Dec 01 16:10:18 crc kubenswrapper[4931]: I1201 16:10:18.604688 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-g9s8h_31166fda-e2fe-4a4a-9717-550172ed4093/manager/0.log" Dec 01 16:10:18 crc kubenswrapper[4931]: I1201 16:10:18.781349 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-n4w4t_a35a63a2-123b-45f4-99fe-4a7baece61be/kube-rbac-proxy/0.log" Dec 01 16:10:18 crc kubenswrapper[4931]: I1201 16:10:18.826162 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-n4w4t_a35a63a2-123b-45f4-99fe-4a7baece61be/manager/0.log" Dec 01 16:10:18 crc kubenswrapper[4931]: I1201 16:10:18.947082 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-zbd9m_cc03b8f7-9a1d-4f98-a70f-6da587e8d1d3/operator/0.log" Dec 01 16:10:19 crc kubenswrapper[4931]: I1201 16:10:19.117307 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-vv2tp_12e80a83-7d9f-417b-934e-83d23085f11b/kube-rbac-proxy/0.log" Dec 01 16:10:19 crc kubenswrapper[4931]: I1201 16:10:19.183842 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-vv2tp_12e80a83-7d9f-417b-934e-83d23085f11b/manager/0.log" Dec 01 16:10:19 crc kubenswrapper[4931]: I1201 16:10:19.218892 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-rv87n_39f1253b-afe7-47b1-8c68-2a36d49f969b/kube-rbac-proxy/0.log" Dec 01 16:10:19 crc kubenswrapper[4931]: I1201 16:10:19.285448 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-95d9848f7-bjcxf_803870f9-7602-4eae-ba61-09e7aa4c63bb/manager/0.log" Dec 01 16:10:19 crc kubenswrapper[4931]: I1201 16:10:19.362619 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lsxmf_75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae/kube-rbac-proxy/0.log" Dec 01 16:10:19 crc kubenswrapper[4931]: I1201 
16:10:19.363267 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-rv87n_39f1253b-afe7-47b1-8c68-2a36d49f969b/manager/0.log" Dec 01 16:10:19 crc kubenswrapper[4931]: I1201 16:10:19.418313 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-lsxmf_75f2a900-ed9f-4f28-bb9f-3cb4e449f6ae/manager/0.log" Dec 01 16:10:19 crc kubenswrapper[4931]: I1201 16:10:19.532070 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6kbkb_18b9fb30-a34f-42ad-9692-84c532f586d6/kube-rbac-proxy/0.log" Dec 01 16:10:19 crc kubenswrapper[4931]: I1201 16:10:19.553728 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6kbkb_18b9fb30-a34f-42ad-9692-84c532f586d6/manager/0.log" Dec 01 16:10:39 crc kubenswrapper[4931]: I1201 16:10:39.038674 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-nx8gh_0216ff96-a3b4-4486-91ab-f73485d18134/control-plane-machine-set-operator/0.log" Dec 01 16:10:39 crc kubenswrapper[4931]: I1201 16:10:39.177813 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mbql2_0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7/kube-rbac-proxy/0.log" Dec 01 16:10:39 crc kubenswrapper[4931]: I1201 16:10:39.216928 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mbql2_0b4ba470-3e6c-4f92-8e7d-552bfb2e38f7/machine-api-operator/0.log" Dec 01 16:10:52 crc kubenswrapper[4931]: I1201 16:10:52.940433 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-lgwml_6057bc63-1367-4776-b9ab-2750c34f017d/cert-manager-controller/0.log" Dec 01 16:10:53 
crc kubenswrapper[4931]: I1201 16:10:53.053221 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-d8s9l_75a5d47d-0c63-48dc-b439-8b69b82c29ed/cert-manager-cainjector/0.log" Dec 01 16:10:53 crc kubenswrapper[4931]: I1201 16:10:53.110809 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-6jglv_c0dce0ee-05c3-42a7-a599-5bff0b0416ee/cert-manager-webhook/0.log" Dec 01 16:11:07 crc kubenswrapper[4931]: I1201 16:11:07.138861 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-lt2nw_a5b7cbc9-6afb-414c-ac6e-569b6d9634ed/nmstate-console-plugin/0.log" Dec 01 16:11:07 crc kubenswrapper[4931]: I1201 16:11:07.289745 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8ljjr_fd514864-e79c-4cd7-9517-0b3f9fbee078/nmstate-handler/0.log" Dec 01 16:11:07 crc kubenswrapper[4931]: I1201 16:11:07.340255 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4xwr7_8be2b09a-fed1-4f7f-9424-5ca00a814c3d/nmstate-metrics/0.log" Dec 01 16:11:07 crc kubenswrapper[4931]: I1201 16:11:07.346148 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4xwr7_8be2b09a-fed1-4f7f-9424-5ca00a814c3d/kube-rbac-proxy/0.log" Dec 01 16:11:07 crc kubenswrapper[4931]: I1201 16:11:07.529034 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-h5j95_915c495c-c7fa-4c00-ad7e-03d0e7ba74c9/nmstate-operator/0.log" Dec 01 16:11:07 crc kubenswrapper[4931]: I1201 16:11:07.539351 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-5xlg8_f821ad14-f34f-49cb-a884-905cf0219454/nmstate-webhook/0.log" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.185102 4931 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-24kpr"] Dec 01 16:11:21 crc kubenswrapper[4931]: E1201 16:11:21.186034 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308ef49d-0979-4c0f-9b9f-7beca102d5b6" containerName="container-00" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.186051 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="308ef49d-0979-4c0f-9b9f-7beca102d5b6" containerName="container-00" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.186245 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="308ef49d-0979-4c0f-9b9f-7beca102d5b6" containerName="container-00" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.192225 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.196912 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24kpr"] Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.385697 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-utilities\") pod \"community-operators-24kpr\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.385763 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4lgs\" (UniqueName: \"kubernetes.io/projected/cfc8849c-0fe1-4e84-8776-8008232effc4-kube-api-access-q4lgs\") pod \"community-operators-24kpr\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.385843 4931 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-catalog-content\") pod \"community-operators-24kpr\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.487227 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-utilities\") pod \"community-operators-24kpr\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.487267 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4lgs\" (UniqueName: \"kubernetes.io/projected/cfc8849c-0fe1-4e84-8776-8008232effc4-kube-api-access-q4lgs\") pod \"community-operators-24kpr\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.487304 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-catalog-content\") pod \"community-operators-24kpr\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.487840 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-catalog-content\") pod \"community-operators-24kpr\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.488060 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-utilities\") pod \"community-operators-24kpr\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.508146 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4lgs\" (UniqueName: \"kubernetes.io/projected/cfc8849c-0fe1-4e84-8776-8008232effc4-kube-api-access-q4lgs\") pod \"community-operators-24kpr\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:21 crc kubenswrapper[4931]: I1201 16:11:21.532559 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.064613 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24kpr"] Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.260709 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-jcr8j_d143b724-552d-4530-844e-c8b752b2ffa3/kube-rbac-proxy/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.356472 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-jcr8j_d143b724-552d-4530-844e-c8b752b2ffa3/controller/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.486010 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-frr-files/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.647355 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-reloader/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.668000 4931 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-reloader/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.687352 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-frr-files/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.715953 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-metrics/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.804643 4931 generic.go:334] "Generic (PLEG): container finished" podID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerID="c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934" exitCode=0 Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.804858 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24kpr" event={"ID":"cfc8849c-0fe1-4e84-8776-8008232effc4","Type":"ContainerDied","Data":"c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934"} Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.805014 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24kpr" event={"ID":"cfc8849c-0fe1-4e84-8776-8008232effc4","Type":"ContainerStarted","Data":"901d7f799e49d1ed1df9a17c3f900b00b7441b245fdc5f543383403289135fb0"} Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.807277 4931 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.857194 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-frr-files/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.859372 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-reloader/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.878739 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-metrics/0.log" Dec 01 16:11:22 crc kubenswrapper[4931]: I1201 16:11:22.944928 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-metrics/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.077627 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-frr-files/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.087143 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-metrics/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.103047 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/cp-reloader/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.110519 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/controller/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.294988 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/kube-rbac-proxy/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.305425 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/frr-metrics/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.309674 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/kube-rbac-proxy-frr/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.447573 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/reloader/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.508312 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-rj9kg_84cf14c6-6a50-4863-98a4-caab8b1f5636/frr-k8s-webhook-server/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.822518 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24kpr" event={"ID":"cfc8849c-0fe1-4e84-8776-8008232effc4","Type":"ContainerStarted","Data":"aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c"} Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.846420 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-549c5b5d67-wrmzh_14764be0-1d4c-47d3-9d5e-3682a7857d04/manager/0.log" Dec 01 16:11:23 crc kubenswrapper[4931]: I1201 16:11:23.943752 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-dbbf79d98-cbdx9_63d2099f-ba55-4f55-97ac-bca52404a30a/webhook-server/0.log" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.043459 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qb2rl_d427cc71-929e-42ae-bc96-87360ba8c005/kube-rbac-proxy/0.log" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.389651 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2mxt4"] Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.391599 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.427642 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mxt4"] Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.530845 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-utilities\") pod \"certified-operators-2mxt4\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.530895 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phjxt\" (UniqueName: \"kubernetes.io/projected/00ee5537-c78c-4102-888b-909925be1a82-kube-api-access-phjxt\") pod \"certified-operators-2mxt4\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.530939 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-catalog-content\") pod \"certified-operators-2mxt4\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.591199 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qb2rl_d427cc71-929e-42ae-bc96-87360ba8c005/speaker/0.log" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.595783 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ch7gc_f1e866ab-084b-436d-86bc-97b7a45e8515/frr/0.log" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.632926 4931 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phjxt\" (UniqueName: \"kubernetes.io/projected/00ee5537-c78c-4102-888b-909925be1a82-kube-api-access-phjxt\") pod \"certified-operators-2mxt4\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.632994 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-catalog-content\") pod \"certified-operators-2mxt4\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.633124 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-utilities\") pod \"certified-operators-2mxt4\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.633571 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-utilities\") pod \"certified-operators-2mxt4\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.634051 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-catalog-content\") pod \"certified-operators-2mxt4\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.664136 4931 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-phjxt\" (UniqueName: \"kubernetes.io/projected/00ee5537-c78c-4102-888b-909925be1a82-kube-api-access-phjxt\") pod \"certified-operators-2mxt4\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.710854 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.855089 4931 generic.go:334] "Generic (PLEG): container finished" podID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerID="aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c" exitCode=0 Dec 01 16:11:24 crc kubenswrapper[4931]: I1201 16:11:24.855133 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24kpr" event={"ID":"cfc8849c-0fe1-4e84-8776-8008232effc4","Type":"ContainerDied","Data":"aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c"} Dec 01 16:11:25 crc kubenswrapper[4931]: I1201 16:11:25.277484 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mxt4"] Dec 01 16:11:25 crc kubenswrapper[4931]: W1201 16:11:25.619722 4931 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ee5537_c78c_4102_888b_909925be1a82.slice/crio-ad123251916acb710f24a51d6841aef64cb255390825abd86c2d651cf3664c04 WatchSource:0}: Error finding container ad123251916acb710f24a51d6841aef64cb255390825abd86c2d651cf3664c04: Status 404 returned error can't find the container with id ad123251916acb710f24a51d6841aef64cb255390825abd86c2d651cf3664c04 Dec 01 16:11:25 crc kubenswrapper[4931]: I1201 16:11:25.866592 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mxt4" 
event={"ID":"00ee5537-c78c-4102-888b-909925be1a82","Type":"ContainerStarted","Data":"b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294"} Dec 01 16:11:25 crc kubenswrapper[4931]: I1201 16:11:25.866919 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mxt4" event={"ID":"00ee5537-c78c-4102-888b-909925be1a82","Type":"ContainerStarted","Data":"ad123251916acb710f24a51d6841aef64cb255390825abd86c2d651cf3664c04"} Dec 01 16:11:26 crc kubenswrapper[4931]: I1201 16:11:26.875869 4931 generic.go:334] "Generic (PLEG): container finished" podID="00ee5537-c78c-4102-888b-909925be1a82" containerID="b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294" exitCode=0 Dec 01 16:11:26 crc kubenswrapper[4931]: I1201 16:11:26.876156 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mxt4" event={"ID":"00ee5537-c78c-4102-888b-909925be1a82","Type":"ContainerDied","Data":"b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294"} Dec 01 16:11:26 crc kubenswrapper[4931]: I1201 16:11:26.878341 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24kpr" event={"ID":"cfc8849c-0fe1-4e84-8776-8008232effc4","Type":"ContainerStarted","Data":"8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209"} Dec 01 16:11:26 crc kubenswrapper[4931]: I1201 16:11:26.924352 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-24kpr" podStartSLOduration=2.744750791 podStartE2EDuration="5.924331398s" podCreationTimestamp="2025-12-01 16:11:21 +0000 UTC" firstStartedPulling="2025-12-01 16:11:22.807042795 +0000 UTC m=+4229.232916462" lastFinishedPulling="2025-12-01 16:11:25.986623402 +0000 UTC m=+4232.412497069" observedRunningTime="2025-12-01 16:11:26.917211407 +0000 UTC m=+4233.343085084" watchObservedRunningTime="2025-12-01 16:11:26.924331398 +0000 UTC 
m=+4233.350205055" Dec 01 16:11:28 crc kubenswrapper[4931]: I1201 16:11:28.907841 4931 generic.go:334] "Generic (PLEG): container finished" podID="00ee5537-c78c-4102-888b-909925be1a82" containerID="bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2" exitCode=0 Dec 01 16:11:28 crc kubenswrapper[4931]: I1201 16:11:28.908487 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mxt4" event={"ID":"00ee5537-c78c-4102-888b-909925be1a82","Type":"ContainerDied","Data":"bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2"} Dec 01 16:11:30 crc kubenswrapper[4931]: I1201 16:11:30.954552 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mxt4" event={"ID":"00ee5537-c78c-4102-888b-909925be1a82","Type":"ContainerStarted","Data":"7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6"} Dec 01 16:11:30 crc kubenswrapper[4931]: I1201 16:11:30.980636 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2mxt4" podStartSLOduration=3.989700369 podStartE2EDuration="6.980612182s" podCreationTimestamp="2025-12-01 16:11:24 +0000 UTC" firstStartedPulling="2025-12-01 16:11:26.879866376 +0000 UTC m=+4233.305740043" lastFinishedPulling="2025-12-01 16:11:29.870778189 +0000 UTC m=+4236.296651856" observedRunningTime="2025-12-01 16:11:30.974320985 +0000 UTC m=+4237.400194652" watchObservedRunningTime="2025-12-01 16:11:30.980612182 +0000 UTC m=+4237.406485849" Dec 01 16:11:31 crc kubenswrapper[4931]: I1201 16:11:31.533770 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:31 crc kubenswrapper[4931]: I1201 16:11:31.534462 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:31 crc kubenswrapper[4931]: I1201 16:11:31.588164 
4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:32 crc kubenswrapper[4931]: I1201 16:11:32.032118 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:32 crc kubenswrapper[4931]: I1201 16:11:32.977610 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24kpr"] Dec 01 16:11:34 crc kubenswrapper[4931]: I1201 16:11:34.711500 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:34 crc kubenswrapper[4931]: I1201 16:11:34.711907 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:34 crc kubenswrapper[4931]: I1201 16:11:34.776270 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:34 crc kubenswrapper[4931]: I1201 16:11:34.991945 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-24kpr" podUID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerName="registry-server" containerID="cri-o://8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209" gracePeriod=2 Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.457062 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.550357 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-utilities\") pod \"cfc8849c-0fe1-4e84-8776-8008232effc4\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.550755 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-catalog-content\") pod \"cfc8849c-0fe1-4e84-8776-8008232effc4\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.550806 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4lgs\" (UniqueName: \"kubernetes.io/projected/cfc8849c-0fe1-4e84-8776-8008232effc4-kube-api-access-q4lgs\") pod \"cfc8849c-0fe1-4e84-8776-8008232effc4\" (UID: \"cfc8849c-0fe1-4e84-8776-8008232effc4\") " Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.551508 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-utilities" (OuterVolumeSpecName: "utilities") pod "cfc8849c-0fe1-4e84-8776-8008232effc4" (UID: "cfc8849c-0fe1-4e84-8776-8008232effc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.557527 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc8849c-0fe1-4e84-8776-8008232effc4-kube-api-access-q4lgs" (OuterVolumeSpecName: "kube-api-access-q4lgs") pod "cfc8849c-0fe1-4e84-8776-8008232effc4" (UID: "cfc8849c-0fe1-4e84-8776-8008232effc4"). InnerVolumeSpecName "kube-api-access-q4lgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.601850 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfc8849c-0fe1-4e84-8776-8008232effc4" (UID: "cfc8849c-0fe1-4e84-8776-8008232effc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.652554 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.652595 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4lgs\" (UniqueName: \"kubernetes.io/projected/cfc8849c-0fe1-4e84-8776-8008232effc4-kube-api-access-q4lgs\") on node \"crc\" DevicePath \"\"" Dec 01 16:11:35 crc kubenswrapper[4931]: I1201 16:11:35.652609 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfc8849c-0fe1-4e84-8776-8008232effc4-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.002863 4931 generic.go:334] "Generic (PLEG): container finished" podID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerID="8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209" exitCode=0 Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.002901 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24kpr" event={"ID":"cfc8849c-0fe1-4e84-8776-8008232effc4","Type":"ContainerDied","Data":"8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209"} Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.002931 4931 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-24kpr" event={"ID":"cfc8849c-0fe1-4e84-8776-8008232effc4","Type":"ContainerDied","Data":"901d7f799e49d1ed1df9a17c3f900b00b7441b245fdc5f543383403289135fb0"} Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.002948 4931 scope.go:117] "RemoveContainer" containerID="8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.002979 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24kpr" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.023644 4931 scope.go:117] "RemoveContainer" containerID="aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.035997 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24kpr"] Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.043864 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-24kpr"] Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.253562 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc8849c-0fe1-4e84-8776-8008232effc4" path="/var/lib/kubelet/pods/cfc8849c-0fe1-4e84-8776-8008232effc4/volumes" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.325955 4931 scope.go:117] "RemoveContainer" containerID="c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.406760 4931 scope.go:117] "RemoveContainer" containerID="8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209" Dec 01 16:11:36 crc kubenswrapper[4931]: E1201 16:11:36.407215 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209\": container with ID 
starting with 8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209 not found: ID does not exist" containerID="8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.407291 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209"} err="failed to get container status \"8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209\": rpc error: code = NotFound desc = could not find container \"8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209\": container with ID starting with 8fe28b4663ff5478b6378decb5bd3613fc923e7e0f202a19e459f04949029209 not found: ID does not exist" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.407336 4931 scope.go:117] "RemoveContainer" containerID="aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c" Dec 01 16:11:36 crc kubenswrapper[4931]: E1201 16:11:36.407752 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c\": container with ID starting with aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c not found: ID does not exist" containerID="aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.407791 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c"} err="failed to get container status \"aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c\": rpc error: code = NotFound desc = could not find container \"aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c\": container with ID starting with aacb4d88188dd10f012c66921880547b4cc9c123d3f21c45861b724659eff43c not found: 
ID does not exist" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.407818 4931 scope.go:117] "RemoveContainer" containerID="c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934" Dec 01 16:11:36 crc kubenswrapper[4931]: E1201 16:11:36.408105 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934\": container with ID starting with c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934 not found: ID does not exist" containerID="c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934" Dec 01 16:11:36 crc kubenswrapper[4931]: I1201 16:11:36.408158 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934"} err="failed to get container status \"c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934\": rpc error: code = NotFound desc = could not find container \"c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934\": container with ID starting with c5bb7650b9dfd22200f80dd748d2f4479a43fba290508efe71cf7d1fd2fbc934 not found: ID does not exist" Dec 01 16:11:40 crc kubenswrapper[4931]: I1201 16:11:40.563265 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/util/0.log" Dec 01 16:11:40 crc kubenswrapper[4931]: I1201 16:11:40.934978 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/util/0.log" Dec 01 16:11:40 crc kubenswrapper[4931]: I1201 16:11:40.943580 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/pull/0.log" Dec 01 16:11:40 crc kubenswrapper[4931]: I1201 16:11:40.966854 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/pull/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.146291 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/util/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.181037 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/pull/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.203626 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkn5nt_f06697c2-e5a8-47a0-b960-d71e2d3c591a/extract/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.334814 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/util/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.520491 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/util/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.560246 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/pull/0.log" Dec 01 
16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.560361 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/pull/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.724597 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/pull/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.736437 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/extract/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.753445 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bqfds_f2d91777-625a-4229-90cb-820f107037e5/util/0.log" Dec 01 16:11:41 crc kubenswrapper[4931]: I1201 16:11:41.905335 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mxt4_00ee5537-c78c-4102-888b-909925be1a82/extract-utilities/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.072589 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mxt4_00ee5537-c78c-4102-888b-909925be1a82/extract-content/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.085425 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mxt4_00ee5537-c78c-4102-888b-909925be1a82/extract-utilities/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.105400 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mxt4_00ee5537-c78c-4102-888b-909925be1a82/extract-content/0.log" Dec 01 
16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.255555 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mxt4_00ee5537-c78c-4102-888b-909925be1a82/extract-content/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.300514 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mxt4_00ee5537-c78c-4102-888b-909925be1a82/extract-utilities/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.303901 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2mxt4_00ee5537-c78c-4102-888b-909925be1a82/registry-server/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.421691 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-utilities/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.629218 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-utilities/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.638897 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-content/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.656710 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-content/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.902806 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-content/0.log" Dec 01 16:11:42 crc kubenswrapper[4931]: I1201 16:11:42.904679 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/extract-utilities/0.log" Dec 01 16:11:43 crc kubenswrapper[4931]: I1201 16:11:43.128434 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-utilities/0.log" Dec 01 16:11:43 crc kubenswrapper[4931]: I1201 16:11:43.136720 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5cmwr_b4240085-00ab-4edf-b2ca-ae04ea170973/registry-server/0.log" Dec 01 16:11:43 crc kubenswrapper[4931]: I1201 16:11:43.293632 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-content/0.log" Dec 01 16:11:43 crc kubenswrapper[4931]: I1201 16:11:43.299690 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-utilities/0.log" Dec 01 16:11:43 crc kubenswrapper[4931]: I1201 16:11:43.339289 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-content/0.log" Dec 01 16:11:43 crc kubenswrapper[4931]: I1201 16:11:43.518955 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-utilities/0.log" Dec 01 16:11:43 crc kubenswrapper[4931]: I1201 16:11:43.524649 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/extract-content/0.log" Dec 01 16:11:43 crc kubenswrapper[4931]: I1201 16:11:43.793119 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8tfrh_d87b0f89-7ea5-4550-bace-1df5f7c508db/marketplace-operator/0.log" Dec 01 16:11:43 crc kubenswrapper[4931]: I1201 16:11:43.868509 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-utilities/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.141197 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-utilities/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.165092 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-content/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.167994 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-content/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.267173 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rpwj9_83ac02de-c63c-4b14-bcbe-28e0ef9d91f9/registry-server/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.332463 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-content/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.406595 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/extract-utilities/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.463804 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-k5l6c_b60c5b6d-f16e-4b68-a761-76678ada1930/registry-server/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.502613 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n85b4_be91a404-d2e8-463a-9aa4-34351d6c67a8/extract-utilities/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.614020 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n85b4_be91a404-d2e8-463a-9aa4-34351d6c67a8/extract-utilities/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.653943 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n85b4_be91a404-d2e8-463a-9aa4-34351d6c67a8/extract-content/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.653946 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n85b4_be91a404-d2e8-463a-9aa4-34351d6c67a8/extract-content/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.768472 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.815397 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2mxt4"] Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.824591 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n85b4_be91a404-d2e8-463a-9aa4-34351d6c67a8/extract-utilities/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.838087 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n85b4_be91a404-d2e8-463a-9aa4-34351d6c67a8/extract-content/0.log" Dec 01 16:11:44 crc kubenswrapper[4931]: I1201 16:11:44.981997 4931 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-n85b4_be91a404-d2e8-463a-9aa4-34351d6c67a8/registry-server/0.log" Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.128742 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2mxt4" podUID="00ee5537-c78c-4102-888b-909925be1a82" containerName="registry-server" containerID="cri-o://7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6" gracePeriod=2 Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.585052 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.631720 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phjxt\" (UniqueName: \"kubernetes.io/projected/00ee5537-c78c-4102-888b-909925be1a82-kube-api-access-phjxt\") pod \"00ee5537-c78c-4102-888b-909925be1a82\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.631782 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-catalog-content\") pod \"00ee5537-c78c-4102-888b-909925be1a82\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.631859 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-utilities\") pod \"00ee5537-c78c-4102-888b-909925be1a82\" (UID: \"00ee5537-c78c-4102-888b-909925be1a82\") " Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.632664 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-utilities" (OuterVolumeSpecName: 
"utilities") pod "00ee5537-c78c-4102-888b-909925be1a82" (UID: "00ee5537-c78c-4102-888b-909925be1a82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.652629 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ee5537-c78c-4102-888b-909925be1a82-kube-api-access-phjxt" (OuterVolumeSpecName: "kube-api-access-phjxt") pod "00ee5537-c78c-4102-888b-909925be1a82" (UID: "00ee5537-c78c-4102-888b-909925be1a82"). InnerVolumeSpecName "kube-api-access-phjxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.681938 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00ee5537-c78c-4102-888b-909925be1a82" (UID: "00ee5537-c78c-4102-888b-909925be1a82"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.734580 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phjxt\" (UniqueName: \"kubernetes.io/projected/00ee5537-c78c-4102-888b-909925be1a82-kube-api-access-phjxt\") on node \"crc\" DevicePath \"\"" Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.734612 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 16:11:45 crc kubenswrapper[4931]: I1201 16:11:45.734622 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00ee5537-c78c-4102-888b-909925be1a82-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.149891 4931 generic.go:334] "Generic (PLEG): container finished" podID="00ee5537-c78c-4102-888b-909925be1a82" containerID="7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6" exitCode=0 Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.149951 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mxt4" event={"ID":"00ee5537-c78c-4102-888b-909925be1a82","Type":"ContainerDied","Data":"7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6"} Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.149983 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mxt4" event={"ID":"00ee5537-c78c-4102-888b-909925be1a82","Type":"ContainerDied","Data":"ad123251916acb710f24a51d6841aef64cb255390825abd86c2d651cf3664c04"} Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.150013 4931 scope.go:117] "RemoveContainer" containerID="7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 
16:11:46.150236 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mxt4" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.180975 4931 scope.go:117] "RemoveContainer" containerID="bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.187039 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2mxt4"] Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.210408 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2mxt4"] Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.217415 4931 scope.go:117] "RemoveContainer" containerID="b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.249302 4931 scope.go:117] "RemoveContainer" containerID="7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6" Dec 01 16:11:46 crc kubenswrapper[4931]: E1201 16:11:46.250072 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6\": container with ID starting with 7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6 not found: ID does not exist" containerID="7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.250103 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6"} err="failed to get container status \"7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6\": rpc error: code = NotFound desc = could not find container \"7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6\": container with ID starting with 
7454414de8ea89c22d03a697e4fbc0408ac096ee5cc9d9501aea1b7ade55b7b6 not found: ID does not exist" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.250124 4931 scope.go:117] "RemoveContainer" containerID="bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2" Dec 01 16:11:46 crc kubenswrapper[4931]: E1201 16:11:46.250347 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2\": container with ID starting with bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2 not found: ID does not exist" containerID="bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.250370 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2"} err="failed to get container status \"bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2\": rpc error: code = NotFound desc = could not find container \"bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2\": container with ID starting with bfae6d36cfe640c8440eb11c207a6063955c9415798a1c5052bf2f01fc3558d2 not found: ID does not exist" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.250428 4931 scope.go:117] "RemoveContainer" containerID="b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294" Dec 01 16:11:46 crc kubenswrapper[4931]: E1201 16:11:46.251354 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294\": container with ID starting with b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294 not found: ID does not exist" containerID="b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294" Dec 01 16:11:46 crc 
kubenswrapper[4931]: I1201 16:11:46.251382 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294"} err="failed to get container status \"b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294\": rpc error: code = NotFound desc = could not find container \"b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294\": container with ID starting with b8d11b4bf7fbb1c18ba75f6a37e9610147994f7f8b6270a77866bdf713055294 not found: ID does not exist" Dec 01 16:11:46 crc kubenswrapper[4931]: I1201 16:11:46.255711 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ee5537-c78c-4102-888b-909925be1a82" path="/var/lib/kubelet/pods/00ee5537-c78c-4102-888b-909925be1a82/volumes" Dec 01 16:12:17 crc kubenswrapper[4931]: E1201 16:12:17.530563 4931 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.2:39124->38.102.83.2:42215: write tcp 38.102.83.2:39124->38.102.83.2:42215: write: broken pipe Dec 01 16:12:19 crc kubenswrapper[4931]: I1201 16:12:19.872221 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:12:19 crc kubenswrapper[4931]: I1201 16:12:19.872330 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:12:49 crc kubenswrapper[4931]: I1201 16:12:49.871808 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:12:49 crc kubenswrapper[4931]: I1201 16:12:49.872534 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:13:19 crc kubenswrapper[4931]: I1201 16:13:19.872324 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:13:19 crc kubenswrapper[4931]: I1201 16:13:19.873257 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:13:19 crc kubenswrapper[4931]: I1201 16:13:19.873300 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" Dec 01 16:13:19 crc kubenswrapper[4931]: I1201 16:13:19.873956 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a361845d1da44cb7b29595626124fca03becd0f451d39200eda9b78eee72f1b"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 16:13:19 crc 
kubenswrapper[4931]: I1201 16:13:19.874010 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://9a361845d1da44cb7b29595626124fca03becd0f451d39200eda9b78eee72f1b" gracePeriod=600 Dec 01 16:13:20 crc kubenswrapper[4931]: I1201 16:13:20.148033 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="9a361845d1da44cb7b29595626124fca03becd0f451d39200eda9b78eee72f1b" exitCode=0 Dec 01 16:13:20 crc kubenswrapper[4931]: I1201 16:13:20.148084 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"9a361845d1da44cb7b29595626124fca03becd0f451d39200eda9b78eee72f1b"} Dec 01 16:13:20 crc kubenswrapper[4931]: I1201 16:13:20.148171 4931 scope.go:117] "RemoveContainer" containerID="cf5d9f863a86f6347d29825cedd5830c549b4ca88054647000cf994d5ec083fc" Dec 01 16:13:21 crc kubenswrapper[4931]: I1201 16:13:21.166005 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerStarted","Data":"0926bec4c83eea074c2b1cb5cba76e7be987db0779aec9324b806c6fcd1a96d3"} Dec 01 16:13:22 crc kubenswrapper[4931]: I1201 16:13:22.179893 4931 generic.go:334] "Generic (PLEG): container finished" podID="a59b11df-2c97-4488-93d7-b4ce4e125e80" containerID="6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55" exitCode=0 Dec 01 16:13:22 crc kubenswrapper[4931]: I1201 16:13:22.180001 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" 
event={"ID":"a59b11df-2c97-4488-93d7-b4ce4e125e80","Type":"ContainerDied","Data":"6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55"} Dec 01 16:13:22 crc kubenswrapper[4931]: I1201 16:13:22.181415 4931 scope.go:117] "RemoveContainer" containerID="6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55" Dec 01 16:13:22 crc kubenswrapper[4931]: I1201 16:13:22.948662 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4xsrk_must-gather-j4z5z_a59b11df-2c97-4488-93d7-b4ce4e125e80/gather/0.log" Dec 01 16:13:33 crc kubenswrapper[4931]: I1201 16:13:33.760365 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4xsrk/must-gather-j4z5z"] Dec 01 16:13:33 crc kubenswrapper[4931]: I1201 16:13:33.761252 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" podUID="a59b11df-2c97-4488-93d7-b4ce4e125e80" containerName="copy" containerID="cri-o://832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8" gracePeriod=2 Dec 01 16:13:33 crc kubenswrapper[4931]: I1201 16:13:33.771443 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4xsrk/must-gather-j4z5z"] Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.214253 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4xsrk_must-gather-j4z5z_a59b11df-2c97-4488-93d7-b4ce4e125e80/copy/0.log" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.214951 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.290158 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a59b11df-2c97-4488-93d7-b4ce4e125e80-must-gather-output\") pod \"a59b11df-2c97-4488-93d7-b4ce4e125e80\" (UID: \"a59b11df-2c97-4488-93d7-b4ce4e125e80\") " Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.290276 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6mz8\" (UniqueName: \"kubernetes.io/projected/a59b11df-2c97-4488-93d7-b4ce4e125e80-kube-api-access-d6mz8\") pod \"a59b11df-2c97-4488-93d7-b4ce4e125e80\" (UID: \"a59b11df-2c97-4488-93d7-b4ce4e125e80\") " Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.297615 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59b11df-2c97-4488-93d7-b4ce4e125e80-kube-api-access-d6mz8" (OuterVolumeSpecName: "kube-api-access-d6mz8") pod "a59b11df-2c97-4488-93d7-b4ce4e125e80" (UID: "a59b11df-2c97-4488-93d7-b4ce4e125e80"). InnerVolumeSpecName "kube-api-access-d6mz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.305185 4931 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4xsrk_must-gather-j4z5z_a59b11df-2c97-4488-93d7-b4ce4e125e80/copy/0.log" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.305518 4931 generic.go:334] "Generic (PLEG): container finished" podID="a59b11df-2c97-4488-93d7-b4ce4e125e80" containerID="832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8" exitCode=143 Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.305572 4931 scope.go:117] "RemoveContainer" containerID="832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.305712 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4xsrk/must-gather-j4z5z" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.370032 4931 scope.go:117] "RemoveContainer" containerID="6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.392943 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6mz8\" (UniqueName: \"kubernetes.io/projected/a59b11df-2c97-4488-93d7-b4ce4e125e80-kube-api-access-d6mz8\") on node \"crc\" DevicePath \"\"" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.437790 4931 scope.go:117] "RemoveContainer" containerID="832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8" Dec 01 16:13:34 crc kubenswrapper[4931]: E1201 16:13:34.438185 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8\": container with ID starting with 832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8 not found: ID does not exist" 
containerID="832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.438219 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8"} err="failed to get container status \"832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8\": rpc error: code = NotFound desc = could not find container \"832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8\": container with ID starting with 832164bc3d0c7746d9040e0371447092d9d0ded96626a23ff06a288e756fd7c8 not found: ID does not exist" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.438239 4931 scope.go:117] "RemoveContainer" containerID="6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55" Dec 01 16:13:34 crc kubenswrapper[4931]: E1201 16:13:34.439116 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55\": container with ID starting with 6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55 not found: ID does not exist" containerID="6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.439138 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55"} err="failed to get container status \"6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55\": rpc error: code = NotFound desc = could not find container \"6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55\": container with ID starting with 6f939bdf9fb2b88a5cdd784047392c3643a6cb6c88c2d18d7517d8fe351b9f55 not found: ID does not exist" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.454356 4931 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a59b11df-2c97-4488-93d7-b4ce4e125e80-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a59b11df-2c97-4488-93d7-b4ce4e125e80" (UID: "a59b11df-2c97-4488-93d7-b4ce4e125e80"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:13:34 crc kubenswrapper[4931]: I1201 16:13:34.494525 4931 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a59b11df-2c97-4488-93d7-b4ce4e125e80-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 16:13:36 crc kubenswrapper[4931]: I1201 16:13:36.255890 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59b11df-2c97-4488-93d7-b4ce4e125e80" path="/var/lib/kubelet/pods/a59b11df-2c97-4488-93d7-b4ce4e125e80/volumes" Dec 01 16:14:39 crc kubenswrapper[4931]: I1201 16:14:39.569061 4931 scope.go:117] "RemoveContainer" containerID="42a0e553c196d833ed427cc59ba4742cfe5f3036cc8befae44a55f20d4464a45" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.177755 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52"] Dec 01 16:15:00 crc kubenswrapper[4931]: E1201 16:15:00.178630 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ee5537-c78c-4102-888b-909925be1a82" containerName="extract-content" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178644 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ee5537-c78c-4102-888b-909925be1a82" containerName="extract-content" Dec 01 16:15:00 crc kubenswrapper[4931]: E1201 16:15:00.178661 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59b11df-2c97-4488-93d7-b4ce4e125e80" containerName="gather" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178667 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a59b11df-2c97-4488-93d7-b4ce4e125e80" containerName="gather" Dec 01 16:15:00 crc kubenswrapper[4931]: E1201 16:15:00.178676 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ee5537-c78c-4102-888b-909925be1a82" containerName="registry-server" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178682 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ee5537-c78c-4102-888b-909925be1a82" containerName="registry-server" Dec 01 16:15:00 crc kubenswrapper[4931]: E1201 16:15:00.178698 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerName="extract-content" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178705 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerName="extract-content" Dec 01 16:15:00 crc kubenswrapper[4931]: E1201 16:15:00.178715 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59b11df-2c97-4488-93d7-b4ce4e125e80" containerName="copy" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178721 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59b11df-2c97-4488-93d7-b4ce4e125e80" containerName="copy" Dec 01 16:15:00 crc kubenswrapper[4931]: E1201 16:15:00.178734 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerName="extract-utilities" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178740 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerName="extract-utilities" Dec 01 16:15:00 crc kubenswrapper[4931]: E1201 16:15:00.178754 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ee5537-c78c-4102-888b-909925be1a82" containerName="extract-utilities" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178759 4931 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="00ee5537-c78c-4102-888b-909925be1a82" containerName="extract-utilities" Dec 01 16:15:00 crc kubenswrapper[4931]: E1201 16:15:00.178776 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerName="registry-server" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178781 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerName="registry-server" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178944 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc8849c-0fe1-4e84-8776-8008232effc4" containerName="registry-server" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178960 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59b11df-2c97-4488-93d7-b4ce4e125e80" containerName="copy" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178970 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59b11df-2c97-4488-93d7-b4ce4e125e80" containerName="gather" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.178985 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ee5537-c78c-4102-888b-909925be1a82" containerName="registry-server" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.179561 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.184270 4931 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.184685 4931 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.199666 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52"] Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.298458 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69vwp\" (UniqueName: \"kubernetes.io/projected/231f3f1a-4a81-4c5c-a42f-604fbde82083-kube-api-access-69vwp\") pod \"collect-profiles-29410095-sfz52\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.298811 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/231f3f1a-4a81-4c5c-a42f-604fbde82083-config-volume\") pod \"collect-profiles-29410095-sfz52\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.298934 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/231f3f1a-4a81-4c5c-a42f-604fbde82083-secret-volume\") pod \"collect-profiles-29410095-sfz52\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.400655 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69vwp\" (UniqueName: \"kubernetes.io/projected/231f3f1a-4a81-4c5c-a42f-604fbde82083-kube-api-access-69vwp\") pod \"collect-profiles-29410095-sfz52\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.400863 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/231f3f1a-4a81-4c5c-a42f-604fbde82083-config-volume\") pod \"collect-profiles-29410095-sfz52\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.400964 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/231f3f1a-4a81-4c5c-a42f-604fbde82083-secret-volume\") pod \"collect-profiles-29410095-sfz52\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.402125 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/231f3f1a-4a81-4c5c-a42f-604fbde82083-config-volume\") pod \"collect-profiles-29410095-sfz52\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.408657 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/231f3f1a-4a81-4c5c-a42f-604fbde82083-secret-volume\") pod \"collect-profiles-29410095-sfz52\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.418892 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69vwp\" (UniqueName: \"kubernetes.io/projected/231f3f1a-4a81-4c5c-a42f-604fbde82083-kube-api-access-69vwp\") pod \"collect-profiles-29410095-sfz52\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:00 crc kubenswrapper[4931]: I1201 16:15:00.504877 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:01 crc kubenswrapper[4931]: I1201 16:15:01.040852 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52"] Dec 01 16:15:01 crc kubenswrapper[4931]: I1201 16:15:01.186120 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" event={"ID":"231f3f1a-4a81-4c5c-a42f-604fbde82083","Type":"ContainerStarted","Data":"452450030a3aa389f5299489c35f27fc30b4839b059794d861512de6e0f320ef"} Dec 01 16:15:02 crc kubenswrapper[4931]: I1201 16:15:02.202797 4931 generic.go:334] "Generic (PLEG): container finished" podID="231f3f1a-4a81-4c5c-a42f-604fbde82083" containerID="fdd0d5b90ad27a23a8cb6f894e083a455e80a440ad047be28b4bc7dc0a608fce" exitCode=0 Dec 01 16:15:02 crc kubenswrapper[4931]: I1201 16:15:02.203107 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" 
event={"ID":"231f3f1a-4a81-4c5c-a42f-604fbde82083","Type":"ContainerDied","Data":"fdd0d5b90ad27a23a8cb6f894e083a455e80a440ad047be28b4bc7dc0a608fce"} Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.667154 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.774826 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/231f3f1a-4a81-4c5c-a42f-604fbde82083-config-volume\") pod \"231f3f1a-4a81-4c5c-a42f-604fbde82083\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.775063 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69vwp\" (UniqueName: \"kubernetes.io/projected/231f3f1a-4a81-4c5c-a42f-604fbde82083-kube-api-access-69vwp\") pod \"231f3f1a-4a81-4c5c-a42f-604fbde82083\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.775104 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/231f3f1a-4a81-4c5c-a42f-604fbde82083-secret-volume\") pod \"231f3f1a-4a81-4c5c-a42f-604fbde82083\" (UID: \"231f3f1a-4a81-4c5c-a42f-604fbde82083\") " Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.776029 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/231f3f1a-4a81-4c5c-a42f-604fbde82083-config-volume" (OuterVolumeSpecName: "config-volume") pod "231f3f1a-4a81-4c5c-a42f-604fbde82083" (UID: "231f3f1a-4a81-4c5c-a42f-604fbde82083"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.781266 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231f3f1a-4a81-4c5c-a42f-604fbde82083-kube-api-access-69vwp" (OuterVolumeSpecName: "kube-api-access-69vwp") pod "231f3f1a-4a81-4c5c-a42f-604fbde82083" (UID: "231f3f1a-4a81-4c5c-a42f-604fbde82083"). InnerVolumeSpecName "kube-api-access-69vwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.782561 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231f3f1a-4a81-4c5c-a42f-604fbde82083-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "231f3f1a-4a81-4c5c-a42f-604fbde82083" (UID: "231f3f1a-4a81-4c5c-a42f-604fbde82083"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.876953 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69vwp\" (UniqueName: \"kubernetes.io/projected/231f3f1a-4a81-4c5c-a42f-604fbde82083-kube-api-access-69vwp\") on node \"crc\" DevicePath \"\"" Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.876992 4931 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/231f3f1a-4a81-4c5c-a42f-604fbde82083-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 16:15:03 crc kubenswrapper[4931]: I1201 16:15:03.877006 4931 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/231f3f1a-4a81-4c5c-a42f-604fbde82083-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 16:15:04 crc kubenswrapper[4931]: I1201 16:15:04.221986 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" 
event={"ID":"231f3f1a-4a81-4c5c-a42f-604fbde82083","Type":"ContainerDied","Data":"452450030a3aa389f5299489c35f27fc30b4839b059794d861512de6e0f320ef"} Dec 01 16:15:04 crc kubenswrapper[4931]: I1201 16:15:04.222323 4931 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="452450030a3aa389f5299489c35f27fc30b4839b059794d861512de6e0f320ef" Dec 01 16:15:04 crc kubenswrapper[4931]: I1201 16:15:04.222374 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410095-sfz52" Dec 01 16:15:04 crc kubenswrapper[4931]: I1201 16:15:04.751707 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz"] Dec 01 16:15:04 crc kubenswrapper[4931]: I1201 16:15:04.769477 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410050-p9hrz"] Dec 01 16:15:06 crc kubenswrapper[4931]: I1201 16:15:06.253802 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268742df-a602-429b-887a-f25239e66bfb" path="/var/lib/kubelet/pods/268742df-a602-429b-887a-f25239e66bfb/volumes" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.616188 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phfpl"] Dec 01 16:15:19 crc kubenswrapper[4931]: E1201 16:15:19.617362 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231f3f1a-4a81-4c5c-a42f-604fbde82083" containerName="collect-profiles" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.617479 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="231f3f1a-4a81-4c5c-a42f-604fbde82083" containerName="collect-profiles" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.617862 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="231f3f1a-4a81-4c5c-a42f-604fbde82083" containerName="collect-profiles" Dec 
01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.619963 4931 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.633931 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phfpl"] Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.784755 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-utilities\") pod \"redhat-operators-phfpl\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.784833 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5q4p\" (UniqueName: \"kubernetes.io/projected/f9a813fa-5946-4d12-8dee-0b979756451b-kube-api-access-m5q4p\") pod \"redhat-operators-phfpl\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.785288 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-catalog-content\") pod \"redhat-operators-phfpl\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.887130 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-utilities\") pod \"redhat-operators-phfpl\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc 
kubenswrapper[4931]: I1201 16:15:19.887495 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5q4p\" (UniqueName: \"kubernetes.io/projected/f9a813fa-5946-4d12-8dee-0b979756451b-kube-api-access-m5q4p\") pod \"redhat-operators-phfpl\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.887623 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-catalog-content\") pod \"redhat-operators-phfpl\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.887785 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-utilities\") pod \"redhat-operators-phfpl\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.888157 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-catalog-content\") pod \"redhat-operators-phfpl\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.908812 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5q4p\" (UniqueName: \"kubernetes.io/projected/f9a813fa-5946-4d12-8dee-0b979756451b-kube-api-access-m5q4p\") pod \"redhat-operators-phfpl\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:19 crc kubenswrapper[4931]: I1201 16:15:19.960963 4931 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:20 crc kubenswrapper[4931]: I1201 16:15:20.415087 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phfpl"] Dec 01 16:15:21 crc kubenswrapper[4931]: I1201 16:15:21.414776 4931 generic.go:334] "Generic (PLEG): container finished" podID="f9a813fa-5946-4d12-8dee-0b979756451b" containerID="707d280741992ecb4293108e16e3fb87983e488ad90aaf24bef57878dd18f1bf" exitCode=0 Dec 01 16:15:21 crc kubenswrapper[4931]: I1201 16:15:21.414859 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfpl" event={"ID":"f9a813fa-5946-4d12-8dee-0b979756451b","Type":"ContainerDied","Data":"707d280741992ecb4293108e16e3fb87983e488ad90aaf24bef57878dd18f1bf"} Dec 01 16:15:21 crc kubenswrapper[4931]: I1201 16:15:21.415364 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfpl" event={"ID":"f9a813fa-5946-4d12-8dee-0b979756451b","Type":"ContainerStarted","Data":"264a835b68ac480f7786449a06e02093faf50b5d34723b32d37a80f70c1856d1"} Dec 01 16:15:22 crc kubenswrapper[4931]: I1201 16:15:22.427354 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfpl" event={"ID":"f9a813fa-5946-4d12-8dee-0b979756451b","Type":"ContainerStarted","Data":"e320e03cbed3b6ccf64bfa3d59226ffca2a5fdf6de7dbcd9d269c7076199b9f5"} Dec 01 16:15:23 crc kubenswrapper[4931]: I1201 16:15:23.439167 4931 generic.go:334] "Generic (PLEG): container finished" podID="f9a813fa-5946-4d12-8dee-0b979756451b" containerID="e320e03cbed3b6ccf64bfa3d59226ffca2a5fdf6de7dbcd9d269c7076199b9f5" exitCode=0 Dec 01 16:15:23 crc kubenswrapper[4931]: I1201 16:15:23.439226 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfpl" 
event={"ID":"f9a813fa-5946-4d12-8dee-0b979756451b","Type":"ContainerDied","Data":"e320e03cbed3b6ccf64bfa3d59226ffca2a5fdf6de7dbcd9d269c7076199b9f5"} Dec 01 16:15:24 crc kubenswrapper[4931]: I1201 16:15:24.448649 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfpl" event={"ID":"f9a813fa-5946-4d12-8dee-0b979756451b","Type":"ContainerStarted","Data":"25c595277070afbb2fc8501e7a6629c402c7c1e189982a5a16f22389b04b8479"} Dec 01 16:15:24 crc kubenswrapper[4931]: I1201 16:15:24.473115 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phfpl" podStartSLOduration=3.017614373 podStartE2EDuration="5.473096755s" podCreationTimestamp="2025-12-01 16:15:19 +0000 UTC" firstStartedPulling="2025-12-01 16:15:21.422473726 +0000 UTC m=+4467.848347403" lastFinishedPulling="2025-12-01 16:15:23.877956118 +0000 UTC m=+4470.303829785" observedRunningTime="2025-12-01 16:15:24.469534205 +0000 UTC m=+4470.895407882" watchObservedRunningTime="2025-12-01 16:15:24.473096755 +0000 UTC m=+4470.898970422" Dec 01 16:15:29 crc kubenswrapper[4931]: I1201 16:15:29.961896 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:29 crc kubenswrapper[4931]: I1201 16:15:29.963649 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:30 crc kubenswrapper[4931]: I1201 16:15:30.383619 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:30 crc kubenswrapper[4931]: I1201 16:15:30.559441 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:30 crc kubenswrapper[4931]: I1201 16:15:30.632154 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-phfpl"] Dec 01 16:15:32 crc kubenswrapper[4931]: I1201 16:15:32.517124 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phfpl" podUID="f9a813fa-5946-4d12-8dee-0b979756451b" containerName="registry-server" containerID="cri-o://25c595277070afbb2fc8501e7a6629c402c7c1e189982a5a16f22389b04b8479" gracePeriod=2 Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.532884 4931 generic.go:334] "Generic (PLEG): container finished" podID="f9a813fa-5946-4d12-8dee-0b979756451b" containerID="25c595277070afbb2fc8501e7a6629c402c7c1e189982a5a16f22389b04b8479" exitCode=0 Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.532961 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfpl" event={"ID":"f9a813fa-5946-4d12-8dee-0b979756451b","Type":"ContainerDied","Data":"25c595277070afbb2fc8501e7a6629c402c7c1e189982a5a16f22389b04b8479"} Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.614172 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.802291 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-utilities\") pod \"f9a813fa-5946-4d12-8dee-0b979756451b\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.803199 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-utilities" (OuterVolumeSpecName: "utilities") pod "f9a813fa-5946-4d12-8dee-0b979756451b" (UID: "f9a813fa-5946-4d12-8dee-0b979756451b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.803649 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5q4p\" (UniqueName: \"kubernetes.io/projected/f9a813fa-5946-4d12-8dee-0b979756451b-kube-api-access-m5q4p\") pod \"f9a813fa-5946-4d12-8dee-0b979756451b\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.803743 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-catalog-content\") pod \"f9a813fa-5946-4d12-8dee-0b979756451b\" (UID: \"f9a813fa-5946-4d12-8dee-0b979756451b\") " Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.804381 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.812566 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a813fa-5946-4d12-8dee-0b979756451b-kube-api-access-m5q4p" (OuterVolumeSpecName: "kube-api-access-m5q4p") pod "f9a813fa-5946-4d12-8dee-0b979756451b" (UID: "f9a813fa-5946-4d12-8dee-0b979756451b"). InnerVolumeSpecName "kube-api-access-m5q4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 16:15:33 crc kubenswrapper[4931]: I1201 16:15:33.907469 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5q4p\" (UniqueName: \"kubernetes.io/projected/f9a813fa-5946-4d12-8dee-0b979756451b-kube-api-access-m5q4p\") on node \"crc\" DevicePath \"\"" Dec 01 16:15:34 crc kubenswrapper[4931]: I1201 16:15:34.322553 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9a813fa-5946-4d12-8dee-0b979756451b" (UID: "f9a813fa-5946-4d12-8dee-0b979756451b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 16:15:34 crc kubenswrapper[4931]: I1201 16:15:34.420150 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a813fa-5946-4d12-8dee-0b979756451b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 16:15:34 crc kubenswrapper[4931]: I1201 16:15:34.547108 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfpl" event={"ID":"f9a813fa-5946-4d12-8dee-0b979756451b","Type":"ContainerDied","Data":"264a835b68ac480f7786449a06e02093faf50b5d34723b32d37a80f70c1856d1"} Dec 01 16:15:34 crc kubenswrapper[4931]: I1201 16:15:34.547171 4931 scope.go:117] "RemoveContainer" containerID="25c595277070afbb2fc8501e7a6629c402c7c1e189982a5a16f22389b04b8479" Dec 01 16:15:34 crc kubenswrapper[4931]: I1201 16:15:34.547791 4931 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phfpl" Dec 01 16:15:34 crc kubenswrapper[4931]: I1201 16:15:34.588785 4931 scope.go:117] "RemoveContainer" containerID="e320e03cbed3b6ccf64bfa3d59226ffca2a5fdf6de7dbcd9d269c7076199b9f5" Dec 01 16:15:34 crc kubenswrapper[4931]: I1201 16:15:34.621810 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phfpl"] Dec 01 16:15:34 crc kubenswrapper[4931]: I1201 16:15:34.632022 4931 scope.go:117] "RemoveContainer" containerID="707d280741992ecb4293108e16e3fb87983e488ad90aaf24bef57878dd18f1bf" Dec 01 16:15:34 crc kubenswrapper[4931]: I1201 16:15:34.647253 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phfpl"] Dec 01 16:15:36 crc kubenswrapper[4931]: I1201 16:15:36.255445 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9a813fa-5946-4d12-8dee-0b979756451b" path="/var/lib/kubelet/pods/f9a813fa-5946-4d12-8dee-0b979756451b/volumes" Dec 01 16:15:39 crc kubenswrapper[4931]: I1201 16:15:39.649452 4931 scope.go:117] "RemoveContainer" containerID="cac391e86c5b6293e162f64a542ad08514661733c1f36c02475665e284023d97" Dec 01 16:15:49 crc kubenswrapper[4931]: I1201 16:15:49.872213 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 16:15:49 crc kubenswrapper[4931]: I1201 16:15:49.872854 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 
16:16:14.297142 4931 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-96wjn"] Dec 01 16:16:14 crc kubenswrapper[4931]: E1201 16:16:14.298067 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a813fa-5946-4d12-8dee-0b979756451b" containerName="extract-content" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.298084 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a813fa-5946-4d12-8dee-0b979756451b" containerName="extract-content" Dec 01 16:16:14 crc kubenswrapper[4931]: E1201 16:16:14.298098 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a813fa-5946-4d12-8dee-0b979756451b" containerName="registry-server" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.298105 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a813fa-5946-4d12-8dee-0b979756451b" containerName="registry-server" Dec 01 16:16:14 crc kubenswrapper[4931]: E1201 16:16:14.298119 4931 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a813fa-5946-4d12-8dee-0b979756451b" containerName="extract-utilities" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.298126 4931 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a813fa-5946-4d12-8dee-0b979756451b" containerName="extract-utilities" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.298360 4931 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a813fa-5946-4d12-8dee-0b979756451b" containerName="registry-server" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.299909 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.360540 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-96wjn"] Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.407054 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7x5t\" (UniqueName: \"kubernetes.io/projected/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-kube-api-access-s7x5t\") pod \"redhat-marketplace-96wjn\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") " pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.407201 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-catalog-content\") pod \"redhat-marketplace-96wjn\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") " pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.407294 4931 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-utilities\") pod \"redhat-marketplace-96wjn\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") " pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.509001 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-catalog-content\") pod \"redhat-marketplace-96wjn\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") " pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.509106 4931 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-utilities\") pod \"redhat-marketplace-96wjn\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") " pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.509237 4931 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7x5t\" (UniqueName: \"kubernetes.io/projected/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-kube-api-access-s7x5t\") pod \"redhat-marketplace-96wjn\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") " pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.509589 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-utilities\") pod \"redhat-marketplace-96wjn\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") " pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.509657 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-catalog-content\") pod \"redhat-marketplace-96wjn\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") " pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.528451 4931 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7x5t\" (UniqueName: \"kubernetes.io/projected/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-kube-api-access-s7x5t\") pod \"redhat-marketplace-96wjn\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") " pod="openshift-marketplace/redhat-marketplace-96wjn" Dec 01 16:16:14 crc kubenswrapper[4931]: I1201 16:16:14.660211 4931 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96wjn"
Dec 01 16:16:15 crc kubenswrapper[4931]: I1201 16:16:15.127522 4931 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-96wjn"]
Dec 01 16:16:15 crc kubenswrapper[4931]: I1201 16:16:15.951616 4931 generic.go:334] "Generic (PLEG): container finished" podID="2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9" containerID="e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f" exitCode=0
Dec 01 16:16:15 crc kubenswrapper[4931]: I1201 16:16:15.951724 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96wjn" event={"ID":"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9","Type":"ContainerDied","Data":"e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f"}
Dec 01 16:16:15 crc kubenswrapper[4931]: I1201 16:16:15.952022 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96wjn" event={"ID":"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9","Type":"ContainerStarted","Data":"0d343087491eb8eb9169a3b62dd385a13d4b423bf522c3b0190e40c5e11cec44"}
Dec 01 16:16:17 crc kubenswrapper[4931]: I1201 16:16:17.972440 4931 generic.go:334] "Generic (PLEG): container finished" podID="2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9" containerID="0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b" exitCode=0
Dec 01 16:16:17 crc kubenswrapper[4931]: I1201 16:16:17.972501 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96wjn" event={"ID":"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9","Type":"ContainerDied","Data":"0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b"}
Dec 01 16:16:18 crc kubenswrapper[4931]: I1201 16:16:18.983635 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96wjn" event={"ID":"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9","Type":"ContainerStarted","Data":"7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac"}
Dec 01 16:16:19 crc kubenswrapper[4931]: I1201 16:16:19.874127 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 16:16:19 crc kubenswrapper[4931]: I1201 16:16:19.874562 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 16:16:24 crc kubenswrapper[4931]: I1201 16:16:24.660570 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-96wjn"
Dec 01 16:16:24 crc kubenswrapper[4931]: I1201 16:16:24.661086 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-96wjn"
Dec 01 16:16:24 crc kubenswrapper[4931]: I1201 16:16:24.709255 4931 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-96wjn"
Dec 01 16:16:24 crc kubenswrapper[4931]: I1201 16:16:24.729452 4931 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-96wjn" podStartSLOduration=8.156003254 podStartE2EDuration="10.729432926s" podCreationTimestamp="2025-12-01 16:16:14 +0000 UTC" firstStartedPulling="2025-12-01 16:16:15.953439372 +0000 UTC m=+4522.379313039" lastFinishedPulling="2025-12-01 16:16:18.526869044 +0000 UTC m=+4524.952742711" observedRunningTime="2025-12-01 16:16:19.02143607 +0000 UTC m=+4525.447309737" watchObservedRunningTime="2025-12-01 16:16:24.729432926 +0000 UTC m=+4531.155306603"
Dec 01 16:16:25 crc kubenswrapper[4931]: I1201 16:16:25.086788 4931 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-96wjn"
Dec 01 16:16:25 crc kubenswrapper[4931]: I1201 16:16:25.139404 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-96wjn"]
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.055491 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-96wjn" podUID="2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9" containerName="registry-server" containerID="cri-o://7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac" gracePeriod=2
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.635522 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96wjn"
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.776065 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7x5t\" (UniqueName: \"kubernetes.io/projected/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-kube-api-access-s7x5t\") pod \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") "
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.776359 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-catalog-content\") pod \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") "
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.776624 4931 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-utilities\") pod \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\" (UID: \"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9\") "
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.777379 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-utilities" (OuterVolumeSpecName: "utilities") pod "2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9" (UID: "2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.795533 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-kube-api-access-s7x5t" (OuterVolumeSpecName: "kube-api-access-s7x5t") pod "2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9" (UID: "2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9"). InnerVolumeSpecName "kube-api-access-s7x5t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.809815 4931 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9" (UID: "2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.878201 4931 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.878408 4931 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7x5t\" (UniqueName: \"kubernetes.io/projected/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-kube-api-access-s7x5t\") on node \"crc\" DevicePath \"\""
Dec 01 16:16:27 crc kubenswrapper[4931]: I1201 16:16:27.878462 4931 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.064874 4931 generic.go:334] "Generic (PLEG): container finished" podID="2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9" containerID="7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac" exitCode=0
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.064915 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96wjn" event={"ID":"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9","Type":"ContainerDied","Data":"7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac"}
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.064926 4931 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96wjn"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.064940 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96wjn" event={"ID":"2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9","Type":"ContainerDied","Data":"0d343087491eb8eb9169a3b62dd385a13d4b423bf522c3b0190e40c5e11cec44"}
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.064958 4931 scope.go:117] "RemoveContainer" containerID="7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.084119 4931 scope.go:117] "RemoveContainer" containerID="0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.097543 4931 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-96wjn"]
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.103307 4931 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-96wjn"]
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.111204 4931 scope.go:117] "RemoveContainer" containerID="e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.172762 4931 scope.go:117] "RemoveContainer" containerID="7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac"
Dec 01 16:16:28 crc kubenswrapper[4931]: E1201 16:16:28.173340 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac\": container with ID starting with 7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac not found: ID does not exist" containerID="7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.173507 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac"} err="failed to get container status \"7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac\": rpc error: code = NotFound desc = could not find container \"7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac\": container with ID starting with 7cbe1edddf75971660f74de1aa847aca2777dc356ca07ed5d0a3e9ae781f61ac not found: ID does not exist"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.173585 4931 scope.go:117] "RemoveContainer" containerID="0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b"
Dec 01 16:16:28 crc kubenswrapper[4931]: E1201 16:16:28.174143 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b\": container with ID starting with 0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b not found: ID does not exist" containerID="0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.174183 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b"} err="failed to get container status \"0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b\": rpc error: code = NotFound desc = could not find container \"0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b\": container with ID starting with 0d04ec6bc62ac2a143bb90860971c8a41e23764aa00cee22ea71036482e46c0b not found: ID does not exist"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.174204 4931 scope.go:117] "RemoveContainer" containerID="e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f"
Dec 01 16:16:28 crc kubenswrapper[4931]: E1201 16:16:28.174506 4931 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f\": container with ID starting with e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f not found: ID does not exist" containerID="e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.174526 4931 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f"} err="failed to get container status \"e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f\": rpc error: code = NotFound desc = could not find container \"e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f\": container with ID starting with e7bedb34e626469219f123f943bafe722cb2f22a5e4406e42430733d8768544f not found: ID does not exist"
Dec 01 16:16:28 crc kubenswrapper[4931]: I1201 16:16:28.251652 4931 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9" path="/var/lib/kubelet/pods/2a5fcc89-27b8-4d9d-a70a-5ab60e5fa7a9/volumes"
Dec 01 16:16:49 crc kubenswrapper[4931]: I1201 16:16:49.872153 4931 patch_prober.go:28] interesting pod/machine-config-daemon-crxtx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 16:16:49 crc kubenswrapper[4931]: I1201 16:16:49.872847 4931 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 01 16:16:49 crc kubenswrapper[4931]: I1201 16:16:49.872908 4931 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-crxtx"
Dec 01 16:16:49 crc kubenswrapper[4931]: I1201 16:16:49.873819 4931 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0926bec4c83eea074c2b1cb5cba76e7be987db0779aec9324b806c6fcd1a96d3"} pod="openshift-machine-config-operator/machine-config-daemon-crxtx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 01 16:16:49 crc kubenswrapper[4931]: I1201 16:16:49.873893 4931 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerName="machine-config-daemon" containerID="cri-o://0926bec4c83eea074c2b1cb5cba76e7be987db0779aec9324b806c6fcd1a96d3" gracePeriod=600
Dec 01 16:16:50 crc kubenswrapper[4931]: E1201 16:16:50.001450 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"
Dec 01 16:16:50 crc kubenswrapper[4931]: I1201 16:16:50.335508 4931 generic.go:334] "Generic (PLEG): container finished" podID="daf46d9f-9b61-4808-ab42-392965da3a7e" containerID="0926bec4c83eea074c2b1cb5cba76e7be987db0779aec9324b806c6fcd1a96d3" exitCode=0
Dec 01 16:16:50 crc kubenswrapper[4931]: I1201 16:16:50.335674 4931 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" event={"ID":"daf46d9f-9b61-4808-ab42-392965da3a7e","Type":"ContainerDied","Data":"0926bec4c83eea074c2b1cb5cba76e7be987db0779aec9324b806c6fcd1a96d3"}
Dec 01 16:16:50 crc kubenswrapper[4931]: I1201 16:16:50.335734 4931 scope.go:117] "RemoveContainer" containerID="9a361845d1da44cb7b29595626124fca03becd0f451d39200eda9b78eee72f1b"
Dec 01 16:16:50 crc kubenswrapper[4931]: I1201 16:16:50.339492 4931 scope.go:117] "RemoveContainer" containerID="0926bec4c83eea074c2b1cb5cba76e7be987db0779aec9324b806c6fcd1a96d3"
Dec 01 16:16:50 crc kubenswrapper[4931]: E1201 16:16:50.340489 4931 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-crxtx_openshift-machine-config-operator(daf46d9f-9b61-4808-ab42-392965da3a7e)\"" pod="openshift-machine-config-operator/machine-config-daemon-crxtx" podUID="daf46d9f-9b61-4808-ab42-392965da3a7e"